34589 1727204099.12401: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-bGV executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 34589 1727204099.13297: Added group all to inventory 34589 1727204099.13299: Added group ungrouped to inventory 34589 1727204099.13303: Group all now contains ungrouped 34589 1727204099.13309: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml 34589 1727204099.49288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 34589 1727204099.49353: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 34589 1727204099.49378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 34589 1727204099.49437: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 34589 1727204099.49717: Loaded config def from plugin (inventory/script) 34589 1727204099.49719: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 34589 1727204099.49762: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 34589 1727204099.49851: Loaded config def from plugin (inventory/yaml) 34589 1727204099.49853: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 34589 1727204099.50139: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 34589 1727204099.50968: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 34589 1727204099.50971: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 34589 1727204099.50974: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 34589 1727204099.50981: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 34589 1727204099.50986: Loading data from /tmp/network-zt6/inventory-rSl.yml 34589 1727204099.51044: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto 34589 1727204099.51310: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 34589 1727204099.51348: Loading data from /tmp/network-zt6/inventory-rSl.yml 34589 1727204099.51433: group all already in inventory 34589 1727204099.51440: set inventory_file for managed-node1 34589 1727204099.51444: set inventory_dir for managed-node1 34589 1727204099.51445: Added host managed-node1 to inventory 34589 1727204099.51448: Added host managed-node1 to group all 34589 1727204099.51449: set ansible_host for managed-node1 34589 1727204099.51450: set ansible_ssh_extra_args for managed-node1 34589 1727204099.51453: set inventory_file for managed-node2 34589 1727204099.51455: set inventory_dir for managed-node2 34589 1727204099.51456: Added host managed-node2 to inventory 34589 1727204099.51457: Added host managed-node2 to group 
all 34589 1727204099.51458: set ansible_host for managed-node2 34589 1727204099.51459: set ansible_ssh_extra_args for managed-node2 34589 1727204099.51462: set inventory_file for managed-node3 34589 1727204099.51464: set inventory_dir for managed-node3 34589 1727204099.51465: Added host managed-node3 to inventory 34589 1727204099.51466: Added host managed-node3 to group all 34589 1727204099.51467: set ansible_host for managed-node3 34589 1727204099.51468: set ansible_ssh_extra_args for managed-node3 34589 1727204099.51470: Reconcile groups and hosts in inventory. 34589 1727204099.51474: Group ungrouped now contains managed-node1 34589 1727204099.51680: Group ungrouped now contains managed-node2 34589 1727204099.51682: Group ungrouped now contains managed-node3 34589 1727204099.51754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 34589 1727204099.51871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 34589 1727204099.52119: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 34589 1727204099.52145: Loaded config def from plugin (vars/host_group_vars) 34589 1727204099.52148: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 34589 1727204099.52155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 34589 1727204099.52163: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 34589 1727204099.52205: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 34589 1727204099.52949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204099.53047: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 34589 1727204099.53293: Loaded config def from plugin (connection/local) 34589 1727204099.53296: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 34589 1727204099.54159: Loaded config def from plugin (connection/paramiko_ssh) 34589 1727204099.54162: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 34589 1727204099.55619: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34589 1727204099.55660: Loaded config def from plugin (connection/psrp) 34589 1727204099.55663: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 34589 1727204099.57289: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34589 1727204099.57332: Loaded config def from plugin (connection/ssh) 34589 1727204099.57336: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 34589 1727204099.61468: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34589 1727204099.61507: Loaded config def from plugin (connection/winrm) 34589 1727204099.61510: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 34589 1727204099.61543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 34589 1727204099.61880: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 34589 1727204099.61947: Loaded config def from plugin (shell/cmd) 34589 1727204099.61949: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 34589 1727204099.62221: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 34589 1727204099.62473: Loaded config def from plugin (shell/powershell) 34589 1727204099.62476: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 34589 1727204099.62528: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 34589 1727204099.62858: Loaded config def from plugin (shell/sh) 34589 1727204099.62860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 34589 1727204099.63164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 34589 1727204099.63311: Loaded config def from plugin (become/runas) 34589 1727204099.63313: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 34589 1727204099.63506: Loaded config def from plugin (become/su) 34589 1727204099.63508: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 34589 1727204099.63661: Loaded config def from plugin (become/sudo) 34589 1727204099.63663: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 34589 1727204099.63699: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 34589 1727204099.64034: in VariableManager get_vars() 34589 1727204099.64056: done with get_vars() 34589 1727204099.64255: trying /usr/local/lib/python3.12/site-packages/ansible/modules 34589 1727204099.69440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 34589 1727204099.69550: in VariableManager get_vars() 34589 1727204099.69555: done with get_vars() 34589 1727204099.69557: variable 'playbook_dir' from source: magic vars 34589 1727204099.69558: variable 'ansible_playbook_python' from source: magic vars 34589 1727204099.69559: variable 'ansible_config_file' 
from source: magic vars 34589 1727204099.69560: variable 'groups' from source: magic vars 34589 1727204099.69560: variable 'omit' from source: magic vars 34589 1727204099.69561: variable 'ansible_version' from source: magic vars 34589 1727204099.69562: variable 'ansible_check_mode' from source: magic vars 34589 1727204099.69562: variable 'ansible_diff_mode' from source: magic vars 34589 1727204099.69563: variable 'ansible_forks' from source: magic vars 34589 1727204099.69564: variable 'ansible_inventory_sources' from source: magic vars 34589 1727204099.69564: variable 'ansible_skip_tags' from source: magic vars 34589 1727204099.69565: variable 'ansible_limit' from source: magic vars 34589 1727204099.69566: variable 'ansible_run_tags' from source: magic vars 34589 1727204099.69566: variable 'ansible_verbosity' from source: magic vars 34589 1727204099.69603: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml 34589 1727204099.70339: in VariableManager get_vars() 34589 1727204099.70354: done with get_vars() 34589 1727204099.70393: in VariableManager get_vars() 34589 1727204099.70414: done with get_vars() 34589 1727204099.70448: in VariableManager get_vars() 34589 1727204099.70459: done with get_vars() 34589 1727204099.70565: in VariableManager get_vars() 34589 1727204099.70581: done with get_vars() 34589 1727204099.70586: variable 'omit' from source: magic vars 34589 1727204099.70605: variable 'omit' from source: magic vars 34589 1727204099.70638: in VariableManager get_vars() 34589 1727204099.70649: done with get_vars() 34589 1727204099.70696: in VariableManager get_vars() 34589 1727204099.70709: done with get_vars() 34589 1727204099.70742: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34589 1727204099.70956: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34589 1727204099.71144: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34589 1727204099.72398: in VariableManager get_vars() 34589 1727204099.72423: done with get_vars() 34589 1727204099.72899: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 34589 1727204099.73034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204099.75067: in VariableManager get_vars() 34589 1727204099.75071: done with get_vars() 34589 1727204099.75074: variable 'playbook_dir' from source: magic vars 34589 1727204099.75077: variable 'ansible_playbook_python' from source: magic vars 34589 1727204099.75078: variable 'ansible_config_file' from source: magic vars 34589 1727204099.75079: variable 'groups' from source: magic vars 34589 1727204099.75080: variable 'omit' from source: magic vars 34589 1727204099.75080: variable 'ansible_version' from source: magic vars 34589 1727204099.75087: variable 'ansible_check_mode' from source: magic vars 34589 1727204099.75089: variable 'ansible_diff_mode' from source: magic vars 34589 1727204099.75090: variable 'ansible_forks' from source: magic vars 34589 1727204099.75090: variable 'ansible_inventory_sources' from source: magic vars 34589 1727204099.75091: variable 'ansible_skip_tags' from source: magic vars 34589 1727204099.75092: variable 'ansible_limit' from source: magic vars 34589 1727204099.75093: variable 
'ansible_run_tags' from source: magic vars 34589 1727204099.75093: variable 'ansible_verbosity' from source: magic vars 34589 1727204099.75137: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 34589 1727204099.75228: in VariableManager get_vars() 34589 1727204099.75239: done with get_vars() 34589 1727204099.75280: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34589 1727204099.75398: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34589 1727204099.75468: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34589 1727204099.75941: in VariableManager get_vars() 34589 1727204099.75962: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204099.77598: in VariableManager get_vars() 34589 1727204099.77601: done with get_vars() 34589 1727204099.77603: variable 'playbook_dir' from source: magic vars 34589 1727204099.77604: variable 'ansible_playbook_python' from source: magic vars 34589 1727204099.77605: variable 'ansible_config_file' from source: magic vars 34589 1727204099.77608: variable 'groups' from source: magic vars 34589 1727204099.77609: variable 'omit' from source: magic vars 34589 1727204099.77609: variable 'ansible_version' from source: magic vars 34589 1727204099.77610: variable 'ansible_check_mode' from source: magic vars 34589 1727204099.77611: variable 'ansible_diff_mode' from source: magic vars 34589 1727204099.77611: variable 'ansible_forks' from source: magic vars 34589 1727204099.77612: variable 'ansible_inventory_sources' from source: magic vars 34589 1727204099.77613: variable 'ansible_skip_tags' from source: magic vars 34589 1727204099.77613: variable 'ansible_limit' from source: magic vars 34589 1727204099.77614: variable 'ansible_run_tags' from source: magic vars 34589 1727204099.77615: variable 'ansible_verbosity' from source: magic vars 34589 1727204099.77650: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 34589 1727204099.77730: in VariableManager get_vars() 34589 1727204099.77743: done with get_vars() 34589 1727204099.77785: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34589 1727204099.77901: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34589 1727204099.82149: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34589 1727204099.82651: in VariableManager get_vars() 34589 1727204099.82673: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204099.84272: in VariableManager get_vars() 34589 1727204099.84289: done with get_vars() 34589 1727204099.84326: in VariableManager get_vars() 34589 1727204099.84339: done with get_vars() 34589 1727204099.84373: in VariableManager get_vars() 34589 1727204099.84399: done with get_vars() 34589 1727204099.84434: in VariableManager get_vars() 34589 1727204099.84445: done with get_vars() 34589 1727204099.84504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 34589 1727204099.84518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ 
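The inventory handling traced above (the yaml inventory plugin parsing /tmp/network-zt6/inventory-rSl.yml, adding managed-node1 through managed-node3 to the all/ungrouped groups, and setting their ansible_host and ansible_ssh_extra_args host vars) can be reproduced outside a playbook run with Ansible's Python API. The sketch below is an illustration only, not code from this run: the inventory path and host names are taken from the log, everything else (the API usage itself) is an assumption about ansible-core's public classes.

    # Minimal sketch: load the same YAML inventory the trace shows and print per-host vars.
    # Assumes ansible-core is importable in the local Python environment.
    from ansible.parsing.dataloader import DataLoader
    from ansible.inventory.manager import InventoryManager
    from ansible.vars.manager import VariableManager

    loader = DataLoader()
    inventory = InventoryManager(loader=loader,
                                 sources=["/tmp/network-zt6/inventory-rSl.yml"])
    variable_manager = VariableManager(loader=loader, inventory=inventory)

    for host in inventory.get_hosts():  # managed-node1..3 according to the log
        hostvars = variable_manager.get_vars(host=host)
        print(host.name,
              hostvars.get("ansible_host"),
              hostvars.get("ansible_ssh_extra_args"))
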
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 34589 1727204099.84744: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 34589 1727204099.84906: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 34589 1727204099.84909: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 34589 1727204099.84937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 34589 1727204099.84960: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 34589 1727204099.85113: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 34589 1727204099.85172: Loaded config def from plugin (callback/default) 34589 1727204099.85177: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204099.86253: Loaded config def from plugin (callback/junit) 34589 1727204099.86255: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204099.86300: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 34589 1727204099.86363: Loaded config def from plugin (callback/minimal) 34589 1727204099.86366: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204099.86407: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204099.86465: Loaded config def from plugin (callback/tree) 34589 1727204099.86467: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to 
ansible.posix.profile_tasks 34589 1727204099.86578: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 34589 1727204099.86580: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_disabled_nm.yml ******************************************* 5 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml 34589 1727204099.86602: in VariableManager get_vars() 34589 1727204099.86614: done with get_vars() 34589 1727204099.86619: in VariableManager get_vars() 34589 1727204099.86626: done with get_vars() 34589 1727204099.86629: variable 'omit' from source: magic vars 34589 1727204099.86660: in VariableManager get_vars() 34589 1727204099.86671: done with get_vars() 34589 1727204099.86690: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] **** 34589 1727204099.87192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 34589 1727204099.87260: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 34589 1727204099.87294: getting the remaining hosts for this loop 34589 1727204099.87295: done getting the remaining hosts for this loop 34589 1727204099.87298: getting the next task for host managed-node1 34589 1727204099.87301: done getting next task for host managed-node1 34589 1727204099.87303: ^ task is: TASK: Gathering Facts 34589 1727204099.87304: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204099.87306: getting variables 34589 1727204099.87307: in VariableManager get_vars() 34589 1727204099.87316: Calling all_inventory to load vars for managed-node1 34589 1727204099.87318: Calling groups_inventory to load vars for managed-node1 34589 1727204099.87321: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204099.87331: Calling all_plugins_play to load vars for managed-node1 34589 1727204099.87339: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204099.87342: Calling groups_plugins_play to load vars for managed-node1 34589 1727204099.87370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204099.87426: done with get_vars() 34589 1727204099.87432: done getting variables 34589 1727204099.87500: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.010) 0:00:00.010 ***** 34589 1727204099.87521: entering _queue_task() for managed-node1/gather_facts 34589 1727204099.87522: Creating lock for gather_facts 34589 1727204099.87938: worker is 1 (out of 1 available) 34589 1727204099.87949: exiting _queue_task() for managed-node1/gather_facts 34589 1727204099.87962: done queuing things up, now waiting for results queue to drain 34589 1727204099.87963: waiting for pending results... 
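From here the trace follows the normal fact-gathering flow: the linear strategy queues the Gathering Facts task to a worker, the ssh connection plugin probes the remote home directory, creates a remote temp directory, discovers a Python interpreter, and transfers and runs AnsiballZ_setup.py. A run producing an equivalent event stream could also be driven programmatically; the sketch below uses the ansible-runner package as a hypothetical alternative front end (it is not how this trace was generated), with the playbook and inventory paths copied from the log.

    # Hypothetical reproduction via the ansible-runner package (assumption; not used in this run).
    import ansible_runner

    result = ansible_runner.run(
        private_data_dir="/tmp/runner-demo",  # assumed scratch directory
        playbook="/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/"
                 "tests/network/tests_ipv6_disabled_nm.yml",
        inventory="/tmp/network-zt6/inventory-rSl.yml",
        verbosity=4,  # roughly equivalent to -vvvv
    )
    print(result.status, result.rc)
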
34589 1727204099.88572: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34589 1727204099.88655: in run() - task 028d2410-947f-a9c6-cddc-0000000000a3 34589 1727204099.88768: variable 'ansible_search_path' from source: unknown 34589 1727204099.88816: calling self._execute() 34589 1727204099.88893: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204099.88908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204099.88922: variable 'omit' from source: magic vars 34589 1727204099.89033: variable 'omit' from source: magic vars 34589 1727204099.89074: variable 'omit' from source: magic vars 34589 1727204099.89125: variable 'omit' from source: magic vars 34589 1727204099.89226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204099.89230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204099.89256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204099.89284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204099.89302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204099.89338: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204099.89353: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204099.89402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204099.89513: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204099.89590: Set connection var ansible_shell_executable to /bin/sh 34589 1727204099.89593: Set connection var ansible_timeout to 10 34589 1727204099.89596: Set connection var ansible_shell_type to sh 34589 1727204099.89599: Set connection var ansible_connection to ssh 34589 1727204099.89601: Set connection var ansible_pipelining to False 34589 1727204099.89604: variable 'ansible_shell_executable' from source: unknown 34589 1727204099.89605: variable 'ansible_connection' from source: unknown 34589 1727204099.89607: variable 'ansible_module_compression' from source: unknown 34589 1727204099.89610: variable 'ansible_shell_type' from source: unknown 34589 1727204099.89612: variable 'ansible_shell_executable' from source: unknown 34589 1727204099.89614: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204099.89616: variable 'ansible_pipelining' from source: unknown 34589 1727204099.89618: variable 'ansible_timeout' from source: unknown 34589 1727204099.89680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204099.89967: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204099.89986: variable 'omit' from source: magic vars 34589 1727204099.89995: starting attempt loop 34589 1727204099.90001: running the handler 34589 1727204099.90020: variable 'ansible_facts' from source: unknown 34589 1727204099.90046: _low_level_execute_command(): starting 34589 1727204099.90067: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204099.90985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204099.90989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204099.90991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204099.90994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204099.91103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204099.92985: stdout chunk (state=3): >>>/root <<< 34589 1727204099.93332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204099.93336: stdout chunk (state=3): >>><<< 34589 1727204099.93338: stderr chunk (state=3): >>><<< 34589 1727204099.93342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204099.93344: _low_level_execute_command(): starting 34589 1727204099.93347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879 `" && echo ansible-tmp-1727204099.9323637-34834-86768102220879="` echo /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879 `" ) && sleep 0' 34589 1727204099.94285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204099.94585: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204099.94609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204099.94725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204099.96895: stdout chunk (state=3): >>>ansible-tmp-1727204099.9323637-34834-86768102220879=/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879 <<< 34589 1727204099.96973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204099.97066: stderr chunk (state=3): >>><<< 34589 1727204099.97088: stdout chunk (state=3): >>><<< 34589 1727204099.97109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204099.9323637-34834-86768102220879=/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204099.97149: variable 'ansible_module_compression' from source: unknown 34589 1727204099.97491: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34589 1727204099.97495: ANSIBALLZ: Acquiring lock 34589 1727204099.97497: ANSIBALLZ: Lock acquired: 140222054199088 34589 1727204099.97500: ANSIBALLZ: Creating module 34589 1727204100.32904: ANSIBALLZ: Writing module into payload 34589 1727204100.33058: ANSIBALLZ: Writing module 34589 1727204100.33100: ANSIBALLZ: Renaming module 34589 1727204100.33117: ANSIBALLZ: Done creating module 34589 1727204100.33144: 
variable 'ansible_facts' from source: unknown 34589 1727204100.33163: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204100.33182: _low_level_execute_command(): starting 34589 1727204100.33194: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34589 1727204100.33856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204100.34005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204100.34116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204100.34191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204100.34204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204100.34324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204100.36099: stdout chunk (state=3): >>>PLATFORM <<< 34589 1727204100.36187: stdout chunk (state=3): >>>Linux <<< 34589 1727204100.36241: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34589 1727204100.36398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204100.36408: stdout chunk (state=3): >>><<< 34589 1727204100.36424: stderr chunk (state=3): >>><<< 34589 1727204100.36451: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204100.36468 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34589 1727204100.36525: _low_level_execute_command(): starting 34589 1727204100.36538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34589 1727204100.36794: Sending initial data 34589 1727204100.36797: Sent initial data (1181 bytes) 34589 1727204100.37594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204100.37686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204100.37926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204100.37959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204100.41699: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 34589 1727204100.42675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204100.42686: stdout chunk (state=3): >>><<< 34589 1727204100.42689: stderr chunk (state=3): >>><<< 34589 1727204100.42692: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204100.42846: variable 'ansible_facts' from source: unknown 34589 1727204100.43185: variable 'ansible_facts' from source: unknown 34589 1727204100.43189: variable 'ansible_module_compression' from source: unknown 34589 1727204100.43191: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204100.43193: variable 'ansible_facts' from source: unknown 34589 1727204100.43572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py 34589 1727204100.44112: Sending initial data 34589 1727204100.44290: Sent initial data (153 bytes) 34589 1727204100.45752: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204100.45756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204100.45758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204100.45760: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204100.46097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204100.46169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204100.46235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204100.48089: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204100.48159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204100.48316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmptqh6sj_r /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py <<< 34589 1727204100.48319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py" <<< 34589 1727204100.48530: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmptqh6sj_r" to remote "/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py" <<< 34589 1727204100.51583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204100.51587: stdout chunk (state=3): >>><<< 34589 1727204100.51590: stderr chunk (state=3): >>><<< 34589 1727204100.51592: done transferring module to remote 34589 1727204100.51594: _low_level_execute_command(): starting 34589 1727204100.51596: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/ /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py && sleep 0' 34589 1727204100.52831: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204100.52892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204100.53038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204100.53064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204100.53094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204100.53207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 34589 1727204100.55202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204100.55310: stderr chunk (state=3): >>><<< 34589 1727204100.55385: stdout chunk (state=3): >>><<< 34589 1727204100.55388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204100.55391: _low_level_execute_command(): starting 34589 1727204100.55393: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/AnsiballZ_setup.py && sleep 0' 34589 1727204100.56395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204100.56443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204100.56472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204100.56689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204100.59008: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34589 1727204100.59034: stdout chunk (state=3): >>>import _imp # builtin <<< 34589 1727204100.59060: stdout chunk (state=3): >>>import '_thread' # <<< 34589 1727204100.59071: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 34589 1727204100.59144: stdout chunk (state=3): >>>import '_io' # <<< 34589 1727204100.59147: stdout chunk (state=3): >>>import 
'marshal' # <<< 34589 1727204100.59179: stdout chunk (state=3): >>>import 'posix' # <<< 34589 1727204100.59218: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34589 1727204100.59250: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34589 1727204100.59305: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.59330: stdout chunk (state=3): >>>import '_codecs' # <<< 34589 1727204100.59347: stdout chunk (state=3): >>>import 'codecs' # <<< 34589 1727204100.59385: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34589 1727204100.59419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34589 1727204100.59428: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce184d0> <<< 34589 1727204100.59430: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcde7b30> <<< 34589 1727204100.59460: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce1aa50> <<< 34589 1727204100.59516: stdout chunk (state=3): >>>import '_signal' # <<< 34589 1727204100.59519: stdout chunk (state=3): >>>import '_abc' # <<< 34589 1727204100.59529: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 34589 1727204100.59563: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34589 1727204100.59657: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34589 1727204100.59692: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34589 1727204100.59743: stdout chunk (state=3): >>>import 'os' # <<< 34589 1727204100.59780: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34589 1727204100.59783: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34589 1727204100.60028: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34589 1727204100.60032: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc2e060> import 'site' # Python 
3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34589 1727204100.60370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34589 1727204100.60389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34589 1727204100.60403: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34589 1727204100.60418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.60438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34589 1727204100.60491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34589 1727204100.60509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34589 1727204100.60536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34589 1727204100.60561: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc6be90> <<< 34589 1727204100.60582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34589 1727204100.60609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34589 1727204100.60640: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc6bf50> <<< 34589 1727204100.60654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34589 1727204100.60664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34589 1727204100.60690: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34589 1727204100.60895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.60925: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcca3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcca3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc83b60> import '_functools' # <<< 34589 1727204100.60938: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc81280> <<< 34589 1727204100.61048: stdout chunk (state=3): >>>import 'enum' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc69040> <<< 34589 1727204100.61080: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34589 1727204100.61126: stdout chunk (state=3): >>>import '_sre' # <<< 34589 1727204100.61140: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34589 1727204100.61171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34589 1727204100.61386: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc7800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc4c20> <<< 34589 1727204100.61400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34589 1727204100.61481: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc66de0> <<< 34589 1727204100.61798: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34589 1727204100.61803: stdout chunk (state=3): >>>import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd11e20> <<< 34589 1727204100.61903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34589 1727204100.61932: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34589 1727204100.61945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34589 1727204100.61966: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.61995: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd134a0> <<< 34589 1727204100.62182: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfa4b0> <<< 34589 1727204100.62232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca2fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca58770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca584d0> <<< 34589 1727204100.62270: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca586b0> <<< 34589 1727204100.62518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34589 1727204100.62522: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca59070> <<< 34589 1727204100.62656: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca599a0> <<< 34589 1727204100.62713: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca58920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca2ddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34589 1727204100.62738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34589 1727204100.62765: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca5ad80> <<< 34589 1727204100.62800: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca59ac0> <<< 34589 1727204100.62856: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/__init__.py <<< 34589 1727204100.62911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.62936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34589 1727204100.63012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34589 1727204100.63041: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca87110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34589 1727204100.63079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34589 1727204100.63208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34589 1727204100.63223: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcaab4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34589 1727204100.63298: stdout chunk (state=3): >>>import 'ntpath' # <<< 34589 1727204100.63513: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb08230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34589 1727204100.63516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb0a990> <<< 34589 1727204100.63613: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb08350> <<< 34589 1727204100.63642: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcad1220> <<< 34589 1727204100.63668: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc919340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcaaa2d0> <<< 34589 1727204100.63694: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca5bce0> <<< 34589 1727204100.63860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34589 1727204100.63910: stdout chunk (state=3): >>>import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f55fcaaa8d0> <<< 34589 1727204100.64413: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1q0e4t1g/ansible_ansible.legacy.setup_payload.zip' <<< 34589 1727204100.64449: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.64553: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.64595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34589 1727204100.64640: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34589 1727204100.64757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34589 1727204100.64779: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc97f0b0> import '_typing' # <<< 34589 1727204100.64957: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc95dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc95d130> <<< 34589 1727204100.64982: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.65011: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34589 1727204100.65061: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.65072: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 34589 1727204100.66561: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.67895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc97d3a0> <<< 34589 1727204100.67954: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34589 1727204100.67958: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34589 1727204100.67981: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b2a20> <<< 34589 1727204100.68081: stdout chunk (state=3): >>>import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b27b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b20c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34589 1727204100.68504: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b2600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce1a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b3e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc329be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc32b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34589 1727204100.68547: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc330200> <<< 34589 1727204100.68550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34589 1727204100.68622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc331100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34589 1727204100.68668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34589 1727204100.68697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34589 1727204100.68743: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc333e30> <<< 34589 1727204100.68822: stdout chunk (state=3): >>># extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccc5ee0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc332120> <<< 34589 1727204100.68839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34589 1727204100.68862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34589 1727204100.68940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34589 1727204100.68943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34589 1727204100.69283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc337ce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3367b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc336510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34589 1727204100.69287: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc336a80> <<< 34589 1727204100.69310: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc332630> <<< 34589 1727204100.69371: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc37bf80> <<< 34589 1727204100.69398: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34589 1727204100.69421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34589 1727204100.69440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34589 
1727204100.69493: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc37dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34589 1727204100.69599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34589 1727204100.69653: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc3801a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.69712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34589 1727204100.69903: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc383950> <<< 34589 1727204100.69940: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc380350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384770> <<< 34589 1727204100.69973: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384830> <<< 34589 1727204100.70024: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.70174: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37c2f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.70179: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc2102f0> <<< 34589 1727204100.70342: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.70424: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc2115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc386a80> <<< 34589 1727204100.70427: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc387e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3866f0> # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.70685: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34589 1727204100.70689: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 34589 1727204100.70691: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34589 1727204100.70744: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.70820: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.70935: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.71521: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.72102: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34589 1727204100.72133: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34589 1727204100.72210: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34589 1727204100.72337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc215940> <<< 34589 1727204100.72366: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc216690> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2116d0> <<< 34589 1727204100.72477: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34589 1727204100.72481: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34589 1727204100.72618: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.72848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2165d0> # zipimport: zlib available <<< 34589 1727204100.73295: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.73783: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.73858: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.73965: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 34589 1727204100.74053: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.74068: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34589 1727204100.74095: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.74190: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 34589 1727204100.74269: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.74309: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34589 1727204100.74323: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.74556: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.74904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 34589 1727204100.74949: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2179b0> <<< 34589 1727204100.74972: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75126: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34589 1727204100.75196: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 34589 1727204100.75241: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 34589 1727204100.75254: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.75396: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34589 1727204100.75469: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34589 1727204100.75518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.75624: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc222150> <<< 34589 1727204100.75680: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc21f200> <<< 34589 1727204100.75766: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34589 1727204100.75785: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75830: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75856: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.75913: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.76014: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34589 1727204100.76017: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34589 1727204100.76042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34589 1727204100.76066: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34589 1727204100.76181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34589 1727204100.76208: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc30aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3fe750> <<< 34589 1727204100.76278: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc221fa0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc217290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34589 1727204100.76335: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.76351: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34589 1727204100.76517: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available 
# zipimport: zlib available <<< 34589 1727204100.76600: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.76708: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.76763: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.76838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34589 1727204100.76857: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.76874: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.76943: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.77183: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.77215: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.77383: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.77429: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.77473: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204100.77499: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34589 1727204100.77533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34589 1727204100.77549: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34589 1727204100.77595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b64e0> <<< 34589 1727204100.77650: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34589 1727204100.77741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34589 1727204100.77797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6c1a0> <<< 34589 1727204100.77800: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.77802: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6c500> <<< 34589 1727204100.77928: stdout chunk (state=3): 
>>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2a32f0> <<< 34589 1727204100.77933: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b7020> <<< 34589 1727204100.77935: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b4bf0> <<< 34589 1727204100.77937: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b4830> <<< 34589 1727204100.77939: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34589 1727204100.78046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34589 1727204100.78049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34589 1727204100.78052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34589 1727204100.78054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 34589 1727204100.78195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6f470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6ed20> <<< 34589 1727204100.78213: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6ef00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6e150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34589 1727204100.78317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34589 1727204100.78334: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6f530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34589 1727204100.78373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34589 1727204100.78474: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbed2030> <<< 34589 1727204100.78481: stdout chunk (state=3): 
>>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6fc80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b48c0> import 'ansible.module_utils.facts.timeout' # <<< 34589 1727204100.78533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 34589 1727204100.78666: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.78670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34589 1727204100.78754: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.78758: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.78807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 34589 1727204100.78854: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.78886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34589 1727204100.78960: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.78996: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.79010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34589 1727204100.79053: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.79184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.79214: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.79278: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.79390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 34589 1727204100.79411: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.79974: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34589 1727204100.80337: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80373: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80434: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80465: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80520: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 34589 1727204100.80645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 34589 1727204100.80649: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34589 1727204100.80787: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80791: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 
34589 1727204100.80796: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.80985: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 34589 1727204100.81027: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.81046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34589 1727204100.81077: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbed22a0> <<< 34589 1727204100.81091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34589 1727204100.81121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34589 1727204100.81296: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbed2e70> import 'ansible.module_utils.facts.system.local' # <<< 34589 1727204100.81299: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.81322: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.81844: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34589 1727204100.81881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34589 1727204100.81958: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204100.82027: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbf0a420> <<< 34589 1727204100.82236: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbefa120> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 34589 1727204100.82303: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34589 1727204100.82367: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82453: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82537: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82647: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82802: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 34589 1727204100.82819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34589 1727204100.82891: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.82911: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 34589 1727204100.82940: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbf1df10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbefb2f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 34589 1727204100.83147: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 34589 1727204100.83284: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34589 1727204100.83570: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83722: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83766: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.83800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 34589 1727204100.83899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.84015: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.84157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34589 1727204100.84174: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.84294: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.84417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34589 1727204100.84498: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.85080: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.85631: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34589 1727204100.85754: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.85863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34589 1727204100.86133: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86137: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 34589 1727204100.86615: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 
1727204100.86618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34589 1727204100.86620: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86626: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34589 1727204100.86629: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86631: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34589 1727204100.86635: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86641: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86742: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.86953: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.87163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34589 1727204100.87303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.87314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 34589 1727204100.87344: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.87397: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.87471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 34589 1727204100.87499: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.87715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.87769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34589 1727204100.87782: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88180: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88320: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34589 1727204100.88336: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88387: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34589 1727204100.88461: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88677: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 34589 1727204100.88684: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88686: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34589 1727204100.88889: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.88892: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 
1727204100.88983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34589 1727204100.88986: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.89005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34589 1727204100.89024: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.89044: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.89142: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.89405: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 34589 1727204100.89625: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.89903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.89928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34589 1727204100.89976: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.90031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 34589 1727204100.90123: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.90302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204100.90396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34589 1727204100.90480: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204100.91799: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34589 1727204100.91827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbd1b7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd19df0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd19790> <<< 34589 1727204101.07617: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34589 1727204101.07652: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd61430> <<< 34589 1727204101.07688: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 34589 1727204101.07711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 34589 1727204101.07734: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd62270> <<< 34589 1727204101.07794: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 34589 1727204101.07798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.07835: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 34589 1727204101.07857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbf108f0> <<< 34589 1727204101.07906: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbf103e0> <<< 34589 1727204101.08148: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34589 1727204101.28394: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_local": {}, "ansible_loadavg": {"1m": 0.7080078125, "5m": 0.53271484375, "15m": 0.27880859375}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "00", "epoch": "1727204100", "epoch_int": "1727204100", "date": "2024-09-24", "time": "14:55:00", "iso8601_micro": "2024-09-24T18:55:00.918999Z", "iso8601": "2024-09-24T18:55:00Z", "iso8601_basic": "20240924T145500918999", "iso8601_basic_short": "20240924T145500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 692, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785624576, "block_size": 4096, "block_total": 65519099, "block_available": 63912506, "block_used": 1606593, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204101.29153: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 34589 1727204101.29214: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # 
cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache<<< 34589 1727204101.29285: stdout chunk (state=3): >>> # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] 
removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 34589 1727204101.29412: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # 
cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin <<< 34589 1727204101.29483: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux<<< 34589 1727204101.29502: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env <<< 34589 1727204101.29611: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy 
ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd<<< 34589 1727204101.29615: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34589 1727204101.30200: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34589 1727204101.30246: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34589 1727204101.30262: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 34589 1727204101.30355: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 34589 1727204101.30359: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34589 1727204101.30447: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34589 1727204101.30511: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # 
destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 34589 1727204101.30539: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 34589 1727204101.30697: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 34589 1727204101.30701: stdout chunk (state=3): >>># destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 34589 1727204101.30733: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 34589 1727204101.30800: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34589 1727204101.30818: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 34589 1727204101.30894: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 34589 1727204101.31006: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref 
# cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 34589 1727204101.31009: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34589 1727204101.31236: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34589 1727204101.31260: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34589 1727204101.31407: stdout chunk (state=3): >>># destroy _typing <<< 34589 1727204101.31410: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34589 1727204101.31435: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 34589 1727204101.31455: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34589 1727204101.31486: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 34589 1727204101.31524: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 34589 1727204101.31782: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 34589 1727204101.31797: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 34589 1727204101.32078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204101.32081: stdout chunk (state=3): >>><<< 34589 1727204101.32083: stderr chunk (state=3): >>><<< 34589 1727204101.32605: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcde7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc2e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc6be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc6bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcca3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcca3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc81280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc7800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc6420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccc4c20> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcc66de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f55fcd12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fcd13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcd134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca2fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca58770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca584d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca586b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca59070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fca599a0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55fca58920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca2ddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca5ad80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca59ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fccfac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca87110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcaab4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb08230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb0a990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcb08350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcad1220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc919340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fcaaa2d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fca5bce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f55fcaaa8d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1q0e4t1g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc97f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc95dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc95d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc97d3a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b2a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b27b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b20c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b2600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fce1a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc9b3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc9b3e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc329be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc32b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc330200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc331100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc333e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fccc5ee0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc332120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc337ce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3367b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f55fc336510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc336a80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc332630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc37bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc37dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc3801a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc383950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc380350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384770> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc384b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc37c2f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc2102f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc2115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc386a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc387e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3866f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc215940> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc216690> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2116d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2165d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2179b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fc222150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc21f200> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc30aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc3fe750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc221fa0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc217290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b64e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6c1a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6c500> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2a32f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b7020> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b4830> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6f470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6ed20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbe6ef00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6e150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6f530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbed2030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbe6fc80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fc2b48c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbed22a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbed2e70> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbf0a420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbefa120> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbf1df10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbefb2f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55fbd1b7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd19df0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd19790> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd61430> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbd62270> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbf108f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55fbf103e0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_local": {}, "ansible_loadavg": {"1m": 0.7080078125, "5m": 0.53271484375, "15m": 0.27880859375}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "00", "epoch": "1727204100", "epoch_int": "1727204100", "date": "2024-09-24", "time": "14:55:00", "iso8601_micro": "2024-09-24T18:55:00.918999Z", "iso8601": "2024-09-24T18:55:00Z", "iso8601_basic": "20240924T145500918999", "iso8601_basic_short": "20240924T145500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 692, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785624576, "block_size": 
4096, "block_total": 65519099, "block_available": 63912506, "block_used": 1606593, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # 
cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache ... # clear sys.audit hooks
[WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
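Both warnings above are recoverable from the log itself: the "junk after the JSON data" is the remote Python interpreter's shutdown chatter printed after the module's JSON result, and the interpreter-discovery notice goes away once the interpreter is pinned for the host. Below is a minimal inventory sketch, assuming a YAML inventory; the address and interpreter path come from the facts gathered in this run, while the ansible_ssh_extra_args value is only a placeholder, since the log does not print the real one.

all:
  hosts:
    managed-node1:
      ansible_host: 10.31.14.47
      # Pinning the interpreter skips discovery and silences the
      # "discovered Python interpreter" warning; the path matches the
      # interpreter reported in the warning above.
      ansible_python_interpreter: /usr/bin/python3.12
      # The log shows ansible_ssh_extra_args set in host vars; its actual
      # value is not shown, so this entry is purely illustrative.
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"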
34589 1727204101.37290: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204101.37312: _low_level_execute_command(): starting 34589 1727204101.37316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204099.9323637-34834-86768102220879/ > /dev/null 2>&1 && sleep 0' 34589 1727204101.38651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204101.38656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204101.38658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204101.38783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.39182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.41154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204101.41158: stderr chunk (state=3): >>><<< 34589 1727204101.41161: stdout chunk (state=3): >>><<< 34589 1727204101.41485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204101.41489: handler run complete 34589 1727204101.41526: variable 'ansible_facts' from source: unknown 34589 1727204101.41884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.42390: variable 'ansible_facts' from source: unknown 34589 1727204101.42589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.42819: attempt loop complete, returning result 34589 1727204101.42823: _execute() done 34589 1727204101.42825: dumping result to json 34589 1727204101.42851: done dumping result, returning 34589 1727204101.42859: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-a9c6-cddc-0000000000a3] 34589 1727204101.42862: sending task result for task 028d2410-947f-a9c6-cddc-0000000000a3 34589 1727204101.43989: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000a3 34589 1727204101.43992: WORKER PROCESS EXITING ok: [managed-node1] 34589 1727204101.44681: no more pending results, returning what we have 34589 1727204101.44685: results queue empty 34589 1727204101.44686: checking for any_errors_fatal 34589 1727204101.44687: done checking for any_errors_fatal 34589 1727204101.44688: checking for max_fail_percentage 34589 1727204101.44690: done checking for max_fail_percentage 34589 1727204101.44691: checking to see if all hosts have failed and the running result is not ok 34589 1727204101.44692: done checking to see if all hosts have failed 34589 1727204101.44693: getting the remaining hosts for this loop 34589 1727204101.44694: done getting the remaining hosts for this loop 34589 1727204101.44699: getting the next task for host managed-node1 34589 1727204101.44707: done getting next task for host managed-node1 34589 1727204101.44709: ^ task is: TASK: meta (flush_handlers) 34589 1727204101.44711: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204101.44716: getting variables 34589 1727204101.44717: in VariableManager get_vars() 34589 1727204101.44740: Calling all_inventory to load vars for managed-node1 34589 1727204101.44743: Calling groups_inventory to load vars for managed-node1 34589 1727204101.44746: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204101.44756: Calling all_plugins_play to load vars for managed-node1 34589 1727204101.44759: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204101.44762: Calling groups_plugins_play to load vars for managed-node1 34589 1727204101.45358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.45820: done with get_vars() 34589 1727204101.45948: done getting variables 34589 1727204101.46019: in VariableManager get_vars() 34589 1727204101.46030: Calling all_inventory to load vars for managed-node1 34589 1727204101.46033: Calling groups_inventory to load vars for managed-node1 34589 1727204101.46035: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204101.46040: Calling all_plugins_play to load vars for managed-node1 34589 1727204101.46042: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204101.46084: Calling groups_plugins_play to load vars for managed-node1 34589 1727204101.46340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.46738: done with get_vars() 34589 1727204101.46757: done queuing things up, now waiting for results queue to drain 34589 1727204101.46759: results queue empty 34589 1727204101.46760: checking for any_errors_fatal 34589 1727204101.46763: done checking for any_errors_fatal 34589 1727204101.46764: checking for max_fail_percentage 34589 1727204101.46765: done checking for max_fail_percentage 34589 1727204101.46766: checking to see if all hosts have failed and the running result is not ok 34589 1727204101.46767: done checking to see if all hosts have failed 34589 1727204101.46773: getting the remaining hosts for this loop 34589 1727204101.46774: done getting the remaining hosts for this loop 34589 1727204101.46779: getting the next task for host managed-node1 34589 1727204101.46783: done getting next task for host managed-node1 34589 1727204101.46786: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34589 1727204101.46787: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204101.46789: getting variables 34589 1727204101.46790: in VariableManager get_vars() 34589 1727204101.46799: Calling all_inventory to load vars for managed-node1 34589 1727204101.46801: Calling groups_inventory to load vars for managed-node1 34589 1727204101.46805: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204101.46810: Calling all_plugins_play to load vars for managed-node1 34589 1727204101.46812: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204101.46814: Calling groups_plugins_play to load vars for managed-node1 34589 1727204101.47188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.47613: done with get_vars() 34589 1727204101.47624: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Tuesday 24 September 2024 14:55:01 -0400 (0:00:01.603) 0:00:01.613 ***** 34589 1727204101.47867: entering _queue_task() for managed-node1/include_tasks 34589 1727204101.47869: Creating lock for include_tasks 34589 1727204101.48551: worker is 1 (out of 1 available) 34589 1727204101.48565: exiting _queue_task() for managed-node1/include_tasks 34589 1727204101.48942: done queuing things up, now waiting for results queue to drain 34589 1727204101.48945: waiting for pending results... 34589 1727204101.49397: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 34589 1727204101.49411: in run() - task 028d2410-947f-a9c6-cddc-000000000006 34589 1727204101.49414: variable 'ansible_search_path' from source: unknown 34589 1727204101.49417: calling self._execute() 34589 1727204101.49544: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204101.49557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204101.49572: variable 'omit' from source: magic vars 34589 1727204101.49779: _execute() done 34589 1727204101.49848: dumping result to json 34589 1727204101.49855: done dumping result, returning 34589 1727204101.49867: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-a9c6-cddc-000000000006] 34589 1727204101.50060: sending task result for task 028d2410-947f-a9c6-cddc-000000000006 34589 1727204101.50137: done sending task result for task 028d2410-947f-a9c6-cddc-000000000006 34589 1727204101.50142: WORKER PROCESS EXITING 34589 1727204101.50214: no more pending results, returning what we have 34589 1727204101.50219: in VariableManager get_vars() 34589 1727204101.50254: Calling all_inventory to load vars for managed-node1 34589 1727204101.50256: Calling groups_inventory to load vars for managed-node1 34589 1727204101.50260: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204101.50274: Calling all_plugins_play to load vars for managed-node1 34589 1727204101.50279: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204101.50282: Calling groups_plugins_play to load vars for managed-node1 34589 1727204101.50790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.51384: done with get_vars() 34589 1727204101.51392: variable 'ansible_search_path' from source: unknown 34589 1727204101.51408: we have included files to process 34589 1727204101.51409: 
generating all_blocks data 34589 1727204101.51410: done generating all_blocks data 34589 1727204101.51411: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34589 1727204101.51413: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34589 1727204101.51415: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34589 1727204101.52737: in VariableManager get_vars() 34589 1727204101.52755: done with get_vars() 34589 1727204101.52767: done processing included file 34589 1727204101.52769: iterating over new_blocks loaded from include file 34589 1727204101.52771: in VariableManager get_vars() 34589 1727204101.52783: done with get_vars() 34589 1727204101.52898: filtering new block on tags 34589 1727204101.52915: done filtering new block on tags 34589 1727204101.52919: in VariableManager get_vars() 34589 1727204101.52931: done with get_vars() 34589 1727204101.52932: filtering new block on tags 34589 1727204101.52946: done filtering new block on tags 34589 1727204101.52949: in VariableManager get_vars() 34589 1727204101.52959: done with get_vars() 34589 1727204101.52960: filtering new block on tags 34589 1727204101.52973: done filtering new block on tags 34589 1727204101.52975: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 34589 1727204101.53084: extending task lists for all hosts with included blocks 34589 1727204101.53140: done extending task lists 34589 1727204101.53141: done processing included files 34589 1727204101.53142: results queue empty 34589 1727204101.53143: checking for any_errors_fatal 34589 1727204101.53144: done checking for any_errors_fatal 34589 1727204101.53145: checking for max_fail_percentage 34589 1727204101.53146: done checking for max_fail_percentage 34589 1727204101.53147: checking to see if all hosts have failed and the running result is not ok 34589 1727204101.53148: done checking to see if all hosts have failed 34589 1727204101.53148: getting the remaining hosts for this loop 34589 1727204101.53149: done getting the remaining hosts for this loop 34589 1727204101.53152: getting the next task for host managed-node1 34589 1727204101.53156: done getting next task for host managed-node1 34589 1727204101.53158: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34589 1727204101.53160: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
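The "HOST STATE:" dumps above are easier to read when their fields are laid out as a structure. The dataclass below is only a reading aid built from the field names in the trace, not Ansible's own class; note how the second dump nests a child state for the tasks pulled in by the include:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class HostState:
        # Field names copied from the "HOST STATE: ..." lines in the trace.
        block: int = 0
        task: int = 0
        rescue: int = 0
        always: int = 0
        handlers: int = 0
        run_state: int = 1
        fail_state: int = 0
        pre_flushing_run_state: Optional[int] = None
        update_handlers: bool = True
        pending_setup: bool = False
        tasks_child_state: Optional["HostState"] = None
        rescue_child_state: Optional["HostState"] = None
        always_child_state: Optional["HostState"] = None
        did_rescue: bool = False
        did_start_at_task: bool = False

    # The state after the include: block=2, task=2 at the play level, with a
    # child iterator positioned at block=0, task=1 inside the included file.
    outer = HostState(block=2, task=2, pre_flushing_run_state=1,
                      tasks_child_state=HostState(block=0, task=1))
    print(outer)
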
False 34589 1727204101.53162: getting variables 34589 1727204101.53164: in VariableManager get_vars() 34589 1727204101.53172: Calling all_inventory to load vars for managed-node1 34589 1727204101.53174: Calling groups_inventory to load vars for managed-node1 34589 1727204101.53179: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204101.53184: Calling all_plugins_play to load vars for managed-node1 34589 1727204101.53187: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204101.53190: Calling groups_plugins_play to load vars for managed-node1 34589 1727204101.53535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204101.53946: done with get_vars() 34589 1727204101.53955: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:55:01 -0400 (0:00:00.062) 0:00:01.676 ***** 34589 1727204101.54142: entering _queue_task() for managed-node1/setup 34589 1727204101.54901: worker is 1 (out of 1 available) 34589 1727204101.54913: exiting _queue_task() for managed-node1/setup 34589 1727204101.54925: done queuing things up, now waiting for results queue to drain 34589 1727204101.54926: waiting for pending results... 34589 1727204101.55598: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 34589 1727204101.55604: in run() - task 028d2410-947f-a9c6-cddc-0000000000b4 34589 1727204101.55607: variable 'ansible_search_path' from source: unknown 34589 1727204101.55610: variable 'ansible_search_path' from source: unknown 34589 1727204101.55614: calling self._execute() 34589 1727204101.55806: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204101.55820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204101.56020: variable 'omit' from source: magic vars 34589 1727204101.57015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204101.61523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204101.61721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204101.61766: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204101.61883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204101.61942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204101.62139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204101.62177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204101.62267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34589 1727204101.62349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204101.62458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204101.62773: variable 'ansible_facts' from source: unknown 34589 1727204101.62993: variable 'network_test_required_facts' from source: task vars 34589 1727204101.63109: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34589 1727204101.63180: variable 'omit' from source: magic vars 34589 1727204101.63183: variable 'omit' from source: magic vars 34589 1727204101.63202: variable 'omit' from source: magic vars 34589 1727204101.63297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204101.63338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204101.63536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204101.63539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204101.63544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204101.63547: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204101.63549: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204101.63551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204101.63725: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204101.63785: Set connection var ansible_shell_executable to /bin/sh 34589 1727204101.63797: Set connection var ansible_timeout to 10 34589 1727204101.63990: Set connection var ansible_shell_type to sh 34589 1727204101.63994: Set connection var ansible_connection to ssh 34589 1727204101.63996: Set connection var ansible_pipelining to False 34589 1727204101.63998: variable 'ansible_shell_executable' from source: unknown 34589 1727204101.64000: variable 'ansible_connection' from source: unknown 34589 1727204101.64002: variable 'ansible_module_compression' from source: unknown 34589 1727204101.64004: variable 'ansible_shell_type' from source: unknown 34589 1727204101.64006: variable 'ansible_shell_executable' from source: unknown 34589 1727204101.64008: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204101.64009: variable 'ansible_pipelining' from source: unknown 34589 1727204101.64011: variable 'ansible_timeout' from source: unknown 34589 1727204101.64013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204101.64310: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204101.64407: variable 'omit' from source: magic vars 34589 1727204101.64410: starting attempt loop 34589 
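The conditional evaluated above, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, makes the setup module run only while some required fact is still missing from the cache. A rough Python equivalent of that test is sketched below; the fact names are illustrative placeholders, not the actual contents of network_test_required_facts:

    def needs_fact_gathering(cached_facts: dict, required: list) -> bool:
        # Jinja's intersect filter keeps the items common to both lists; the
        # surrounding "not ... == required" is True while anything is missing.
        present = [name for name in required if name in cached_facts]
        return present != required

    required = ["distribution", "distribution_major_version"]   # placeholder fact names

    print(needs_fact_gathering({}, required))                   # True  -> gather facts
    print(needs_fact_gathering({"distribution": "CentOS",
                                "distribution_major_version": "9"},
                               required))                        # False -> skip setup
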
1727204101.64413: running the handler 34589 1727204101.64416: _low_level_execute_command(): starting 34589 1727204101.64418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204101.66255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204101.66260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204101.66273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204101.66920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.66939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.68672: stdout chunk (state=3): >>>/root <<< 34589 1727204101.68987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204101.68992: stderr chunk (state=3): >>><<< 34589 1727204101.68994: stdout chunk (state=3): >>><<< 34589 1727204101.68997: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204101.69008: _low_level_execute_command(): starting 34589 1727204101.69011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540 `" && echo ansible-tmp-1727204101.6892247-35202-79557283187540="` echo /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540 `" ) && sleep 
0' 34589 1727204101.70343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204101.70464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204101.70593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204101.70607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204101.70660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.70760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.72871: stdout chunk (state=3): >>>ansible-tmp-1727204101.6892247-35202-79557283187540=/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540 <<< 34589 1727204101.73032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204101.73135: stdout chunk (state=3): >>><<< 34589 1727204101.73139: stderr chunk (state=3): >>><<< 34589 1727204101.73142: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204101.6892247-35202-79557283187540=/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204101.73297: variable 'ansible_module_compression' from source: unknown 34589 1727204101.73339: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204101.73489: variable 'ansible_facts' from source: unknown 34589 1727204101.73960: 
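The mkdir command above builds a per-task working directory whose name has the shape ansible-tmp-<epoch seconds with microseconds>-<pid>-<random number>, created under a umask of 77 so only the owner can reach it. A small local imitation of that naming scheme, for illustration only and not Ansible's implementation:

    import os
    import random
    import time
    from pathlib import Path

    def make_tmp_dir(base: Path) -> Path:
        # Same shape as the directory name in the trace:
        # ansible-tmp-1727204101.6892247-35202-79557283187540
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2 ** 48))
        base.mkdir(parents=True, exist_ok=True)   # ~ mkdir -p ~/.ansible/tmp
        path = base / name
        path.mkdir(mode=0o700)                    # ~ the umask 77 in the command above
        return path

    print(make_tmp_dir(Path("/tmp/demo-ansible-tmp")))
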
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py 34589 1727204101.74523: Sending initial data 34589 1727204101.74526: Sent initial data (153 bytes) 34589 1727204101.75668: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204101.75892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204101.76056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.76174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.77949: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204101.78022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204101.78099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp5lfw0mgo /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py <<< 34589 1727204101.78105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py" <<< 34589 1727204101.78332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp5lfw0mgo" to remote "/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py" <<< 34589 1727204101.81274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204101.81280: stdout chunk (state=3): >>><<< 34589 1727204101.81282: stderr chunk (state=3): >>><<< 34589 1727204101.81285: done transferring module to remote 34589 1727204101.81287: _low_level_execute_command(): starting 34589 1727204101.81289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/ /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py && sleep 0' 34589 1727204101.82852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204101.82946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204101.83087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204101.83135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.83286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.85342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204101.85346: stdout chunk (state=3): >>><<< 34589 1727204101.85352: stderr chunk (state=3): >>><<< 34589 1727204101.85558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204101.85562: _low_level_execute_command(): starting 34589 1727204101.85565: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/AnsiballZ_setup.py && sleep 0' 34589 1727204101.87003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204101.87111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204101.87122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204101.87247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204101.89630: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34589 1727204101.89650: stdout chunk (state=3): >>>import _imp # builtin <<< 34589 1727204101.89676: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34589 1727204101.89743: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34589 1727204101.89785: stdout chunk (state=3): >>>import 'posix' # <<< 34589 1727204101.89823: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34589 1727204101.89854: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34589 1727204101.89929: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 34589 1727204101.90045: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34589 1727204101.90158: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf6aa50> <<< 34589 1727204101.90190: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 34589 1727204101.90383: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34589 1727204101.90409: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34589 1727204101.90434: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd1d130> <<< 34589 1727204101.90502: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34589 1727204101.90598: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd1e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
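Taken together, the chunks above amount to a four-step remote execution: create the temp directory, sftp the AnsiballZ_setup.py payload into it, chmod u+x the directory and the file, then run it with the remote /usr/bin/python3.12 under PYTHONVERBOSE=1, which is why every import is echoed back in the stdout chunks that follow. Below is a condensed re-play of those steps with plain ssh/scp; the host name and paths are placeholders lifted from the trace, and Ansible itself reuses one multiplexed ControlMaster connection and an sftp channel rather than spawning fresh processes like this:

    import subprocess

    host = "managed-node1"                                   # placeholder inventory name
    remote_tmp = "/root/.ansible/tmp/ansible-tmp-demo"       # placeholder directory
    module = "AnsiballZ_setup.py"                            # locally built module payload

    # 1. create the remote temp directory with restrictive permissions
    subprocess.run(["ssh", host, f"umask 77 && mkdir -p {remote_tmp}"], check=True)
    # 2. copy the module over (the trace does this as an sftp 'put')
    subprocess.run(["scp", module, f"{host}:{remote_tmp}/"], check=True)
    # 3. make the directory and the module executable for the owner
    subprocess.run(["ssh", host, f"chmod u+x {remote_tmp} {remote_tmp}/{module}"], check=True)
    # 4. run it; PYTHONVERBOSE=1 makes the interpreter print each import
    subprocess.run(
        ["ssh", host, f"PYTHONVERBOSE=1 /usr/bin/python3.12 {remote_tmp}/{module}"],
        check=True,
    )
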
<<< 34589 1727204101.90973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34589 1727204101.91086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.91090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34589 1727204101.91133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34589 1727204101.91200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd5bf50> <<< 34589 1727204101.91224: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd700e0> <<< 34589 1727204101.91259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34589 1727204101.91290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34589 1727204101.91547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd93920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd93fb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd73bc0> import '_functools' # <<< 34589 1727204101.91551: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd71340> <<< 34589 1727204101.91711: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd59100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 34589 1727204101.91714: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34589 1727204101.91741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34589 1727204101.91744: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34589 1727204101.91843: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd721e0> <<< 34589 1727204101.91846: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb4d10> <<< 34589 1727204101.91941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd58380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34589 1727204101.91980: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bde4e00> <<< 34589 1727204101.92081: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bde50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd56ea0> <<< 34589 1727204101.92085: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.92095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34589 1727204101.92199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34589 1727204101.92312: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde6660> import 
'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34589 1727204101.92530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be00890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be01fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be02e70> <<< 34589 1727204101.92541: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be034a0> <<< 34589 1727204101.92544: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be023c0> <<< 34589 1727204101.92557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34589 1727204101.92602: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.92610: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be03e60> <<< 34589 1727204101.92620: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be03590> <<< 34589 1727204101.92661: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde66c0> <<< 34589 1727204101.92687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34589 1727204101.92717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34589 1727204101.92755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34589 1727204101.92796: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98baffd40> <<< 34589 1727204101.92815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34589 1727204101.92867: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2c860> <<< 34589 1727204101.93086: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2c890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.93120: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2d1c0> <<< 34589 1727204101.93262: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.93299: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2db80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2ca70> <<< 34589 1727204101.93315: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bafdee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34589 1727204101.93420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2ef60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2dcd0> <<< 34589 1727204101.93436: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde6db0> <<< 34589 1727204101.93454: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34589 1727204101.93527: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.93666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb532c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34589 1727204101.93683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.93694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34589 1727204101.93719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34589 1727204101.93771: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb77620> <<< 34589 1727204101.93897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34589 1727204101.93901: stdout chunk (state=3): >>>import 'ntpath' # <<< 34589 1727204101.94120: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbd83b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbdaae0> <<< 34589 1727204101.94201: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbd84a0> <<< 34589 1727204101.94231: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bba1430> <<< 34589 1727204101.94266: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b525460> <<< 34589 1727204101.94286: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb76420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2fec0> <<< 34589 1727204101.94491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34589 1727204101.94502: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa98bb76540> <<< 34589 1727204101.94780: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_setup_payload_jnhbidhq/ansible_setup_payload.zip' # zipimport: zlib available <<< 34589 1727204101.94911: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204101.95000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34589 1727204101.95093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34589 1727204101.95181: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58f110> <<< 34589 1727204101.95185: stdout chunk (state=3): >>>import '_typing' # <<< 34589 1727204101.95333: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b56e000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b56d160> # zipimport: zlib available <<< 34589 1727204101.95394: stdout chunk (state=3): >>>import 'ansible' # <<< 34589 1727204101.95419: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204101.95444: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 34589 1727204101.96931: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204101.98211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58cfe0> <<< 34589 1727204101.98250: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204101.98256: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34589 1727204101.98359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bea50> <<< 34589 1727204101.98370: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be7e0> <<< 34589 1727204101.98429: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches 
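The "# zipimport: found 103 names in ..." line above shows the setup payload being imported straight out of a ZIP_DEFLATED archive placed on sys.path (the same compression the connection vars selected earlier). A self-contained illustration of that mechanism; the file name and module contents below are invented for the demo:

    import importlib
    import sys
    import zipfile

    payload = "/tmp/demo_payload.zip"            # invented path, not the real payload
    with zipfile.ZipFile(payload, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("demo_pkg/__init__.py", "")
        zf.writestr("demo_pkg/hello.py", "MESSAGE = 'imported from a zip archive'\n")

    # Putting the archive on sys.path lets the built-in zipimport machinery
    # resolve imports from it, the same mechanism the trace relies on.
    sys.path.insert(0, payload)
    hello = importlib.import_module("demo_pkg.hello")
    print(hello.MESSAGE)
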
/usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34589 1727204101.98478: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be870> <<< 34589 1727204101.98481: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58fda0> <<< 34589 1727204101.98578: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bf7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bfa10> <<< 34589 1727204101.98581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34589 1727204101.98664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 34589 1727204101.98762: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5bff50> import 'pwd' # <<< 34589 1727204101.98779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34589 1727204101.98798: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b429d90> <<< 34589 1727204101.98829: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.98832: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b42b9b0> <<< 34589 1727204101.98924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34589 1727204101.98927: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42c380> <<< 34589 1727204101.99020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42d280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34589 1727204101.99089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34589 1727204101.99151: 
stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42ff50> <<< 34589 1727204101.99186: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.99398: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be02de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42e210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34589 1727204101.99461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34589 1727204101.99482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34589 1727204101.99507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b437e30> <<< 34589 1727204101.99521: stdout chunk (state=3): >>>import '_tokenize' # <<< 34589 1727204101.99639: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34589 1727204101.99693: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436bd0> <<< 34589 1727204101.99738: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42e720> <<< 34589 1727204101.99761: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204101.99764: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b47bef0> <<< 34589 1727204101.99849: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34589 
1727204101.99865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34589 1727204101.99965: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b47dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34589 1727204102.00025: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.00064: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b4801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34589 1727204102.00172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34589 1727204102.00200: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b483980> <<< 34589 1727204102.00328: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b480380> <<< 34589 1727204102.00400: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484740> <<< 34589 1727204102.00429: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.00440: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484b90> <<< 34589 1727204102.00481: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.00613: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484ad0> <<< 34589 
1727204102.00617: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.00637: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3101a0> <<< 34589 1727204102.00804: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.00808: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b311730> <<< 34589 1727204102.00837: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b486930> <<< 34589 1727204102.00880: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b487ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b486540> <<< 34589 1727204102.00898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34589 1727204102.00942: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.01007: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.01164: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34589 1727204102.01274: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.01290: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.01413: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.02020: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.02705: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.02734: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.02747: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3158e0> <<< 34589 1727204102.02821: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34589 1727204102.02838: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b316660> <<< 34589 1727204102.02907: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b311580> import 'ansible.module_utils.compat.selinux' # <<< 34589 1727204102.03099: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34589 1727204102.03133: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.03294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34589 1727204102.03320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b316630> <<< 34589 1727204102.03497: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.03817: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04316: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04381: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04460: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34589 1727204102.04527: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.04546: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34589 1727204102.04559: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04633: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04715: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34589 1727204102.04746: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34589 1727204102.04762: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.04998: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34589 1727204102.05113: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.05449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 34589 1727204102.05510: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b317980> <<< 34589 1727204102.05595: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.05606: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.05672: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34589 1727204102.05691: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34589 1727204102.05711: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34589 1727204102.05783: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.05884: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.05920: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.05993: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.06107: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34589 1727204102.06110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.06232: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3221e0> <<< 34589 1727204102.06245: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b31f2f0> <<< 34589 1727204102.06341: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34589 1727204102.06461: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.06496: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34589 1727204102.06510: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.06581: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34589 1727204102.06641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34589 1727204102.06688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34589 1727204102.06795: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b40ab40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5ea810> <<< 34589 1727204102.06914: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b322330> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa98b484dd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34589 1727204102.06920: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07005: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34589 1727204102.07013: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 34589 1727204102.07029: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.07091: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 34589 1727204102.07198: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.07229: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07267: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07315: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07390: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34589 1727204102.07443: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07561: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07564: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07578: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07615: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34589 1727204102.07800: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07862: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.07996: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.08035: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.08099: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.08137: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34589 1727204102.08260: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b25a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34589 1727204102.08264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34589 1727204102.08277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34589 1727204102.08331: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34589 1727204102.08468: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af301d0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af30530> <<< 34589 1727204102.08491: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b39ed20> <<< 34589 1727204102.08573: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b30e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b0c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b0770> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34589 1727204102.08633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34589 1727204102.08651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34589 1727204102.08669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34589 1727204102.08816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af33590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af32e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af33020> <<< 34589 1727204102.08850: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af32270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34589 1727204102.08990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af33770> <<< 34589 1727204102.09003: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34589 1727204102.09041: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34589 1727204102.09069: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.09083: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af922a0> <<< 34589 1727204102.09110: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af902c0> <<< 34589 1727204102.09150: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b08c0> import 'ansible.module_utils.facts.timeout' # <<< 34589 1727204102.09256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34589 1727204102.09364: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 34589 1727204102.09378: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.09412: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.09467: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34589 1727204102.09492: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34589 1727204102.09600: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 34589 1727204102.09691: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 34589 1727204102.09715: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.09828: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 34589 1727204102.09931: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.09967: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.10029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 34589 1727204102.10092: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.10564: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11149: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.11166: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11202: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11253: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 34589 1727204102.11279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 34589 1727204102.11508: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available 
# zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.11535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34589 1727204102.11538: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11563: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 34589 1727204102.11693: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.11783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 34589 1727204102.11843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af92510> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34589 1727204102.11869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34589 1727204102.11998: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af93140> import 'ansible.module_utils.facts.system.local' # <<< 34589 1727204102.12056: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.12082: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.12154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34589 1727204102.12169: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.12278: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.12382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 34589 1727204102.12421: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.12582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.12611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34589 1727204102.12718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34589 1727204102.12820: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98afce600> <<< 34589 1727204102.13035: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98afbc950> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 34589 1727204102.13077: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 34589 1727204102.13231: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13317: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13465: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13593: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 34589 1727204102.13596: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13689: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 34589 1727204102.13725: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.13794: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34589 1727204102.14010: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98afe2270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98afe1eb0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34589 1727204102.14153: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.14335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 34589 1727204102.14434: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.14555: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.14696: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.14701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.14824: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.14970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 34589 1727204102.14984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34589 1727204102.15116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.15230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34589 1727204102.15335: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.15927: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.16469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 34589 1727204102.16487: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.16587: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.16704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34589 1727204102.16710: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.16848: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34589 1727204102.16918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34589 1727204102.16921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17177: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17247: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34589 1727204102.17251: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17266: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34589 1727204102.17285: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17322: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 34589 1727204102.17591: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.17797: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.17997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34589 1727204102.18016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 34589 1727204102.18059: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.18166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34589 1727204102.18247: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.18322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 34589 1727204102.18343: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.18382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 34589 1727204102.18487: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.18612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.18822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 34589 1727204102.18919: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.19266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.19333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34589 1727204102.19336: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.19401: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.19417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 34589 1727204102.19513: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34589 1727204102.19617: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 34589 1727204102.19662: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34589 1727204102.19739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34589 1727204102.19843: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34589 1727204102.19851: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.19944: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.19947: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.19986: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.20045: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.20111: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.20399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 34589 1727204102.20402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 34589 1727204102.20620: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 34589 1727204102.20743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 34589 1727204102.20849: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34589 1727204102.20899: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.20956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34589 1727204102.20969: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.21043: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.21133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34589 1727204102.21154: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.21237: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.21331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34589 1727204102.21422: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.21730: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34589 1727204102.21734: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98ade39e0> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa98ade0470> <<< 34589 1727204102.21791: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98ade0860> <<< 34589 1727204102.23048: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "02", "epoch": "1727204102", "epoch_int": "1727204102", "date": "2024-09-24", "time": "14:55:02", "iso8601_micro": "2024-09-24T18:55:02.226479Z", "iso8601": 
"2024-09-24T18:55:02Z", "iso8601_basic": "20240924T145502226479", "iso8601_basic_short": "20240924T145502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204102.23627: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 34589 1727204102.23727: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib 
# cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 34589 1727204102.23953: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy <<< 34589 1727204102.24032: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing 
heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector <<< 34589 1727204102.24040: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34589 1727204102.24916: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 34589 1727204102.24930: stdout chunk (state=3): >>># destroy _ssl <<< 34589 1727204102.24964: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 34589 1727204102.25283: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 34589 1727204102.25526: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # 
destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34589 1727204102.25530: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 34589 1727204102.25564: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34589 1727204102.25599: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34589 1727204102.25637: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34589 1727204102.25742: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy 
marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34589 1727204102.25878: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 34589 1727204102.26289: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34589 1727204102.26474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204102.26480: stdout chunk (state=3): >>><<< 34589 1727204102.26483: stderr chunk (state=3): >>><<< 34589 1727204102.26920: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bf6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd1d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd1e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd5bf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd700e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd93920> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd93fb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd73bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd71340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd59100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd721e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bdb4d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd58380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bde4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bde50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bd56ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be00890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be01fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be02e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be034a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be023c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be03e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98be03590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98baffd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2c890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2d1c0> # extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98bb2db80> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bafdee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2ef60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2dcd0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bde6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb532c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb77620> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbd83b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbdaae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bbd84a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bba1430> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa98b525460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb76420> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98bb2fec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa98bb76540> # zipimport: found 103 names in '/tmp/ansible_setup_payload_jnhbidhq/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58f110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b56e000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b56d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58cfe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5be870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b58fda0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bf7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b5bfa10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5bff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b429d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b42b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42c380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42d280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98be02de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42e210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b437e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b436bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b42e720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b47bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b47dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b4801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b483980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b480380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b484ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b47c3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3101a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b311730> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b486930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b487ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b486540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module 
'_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3158e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b316660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b311580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b316630> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b317980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98b3221e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b31f2f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b40ab40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b5ea810> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b322330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b484dd0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b25a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af301d0> # extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af30530> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b39ed20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b30e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b0c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b0770> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af33590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af32e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af33020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af32270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af33770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98af922a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af902c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98b3b08c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available 
# zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af92510> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98af93140> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98afce600> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98afbc950> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98afe2270> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98afe1eb0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa98ade39e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98ade0470> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa98ade0860> {"ansible_facts": {"ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "02", "epoch": "1727204102", "epoch_int": "1727204102", "date": "2024-09-24", "time": "14:55:02", "iso8601_micro": "2024-09-24T18:55:02.226479Z", "iso8601": "2024-09-24T18:55:02Z", "iso8601_basic": "20240924T145502226479", "iso8601_basic_short": "20240924T145502", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing 
_blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # 
destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34589 1727204102.29804: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204102.29811: _low_level_execute_command(): starting 34589 1727204102.29813: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204101.6892247-35202-79557283187540/ > /dev/null 2>&1 && sleep 0' 34589 1727204102.29815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204102.29885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204102.29888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.30067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204102.30070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204102.30088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.30213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204102.32314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204102.32318: stdout chunk (state=3): >>><<< 34589 1727204102.32321: stderr chunk (state=3): >>><<< 34589 1727204102.32401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204102.32408: handler run complete 34589 1727204102.32587: variable 'ansible_facts' from source: unknown 34589 1727204102.32804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204102.32913: variable 'ansible_facts' from source: unknown 34589 1727204102.33026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204102.33290: attempt loop complete, returning result 34589 1727204102.33294: _execute() done 34589 1727204102.33296: dumping result to json 34589 1727204102.33298: done dumping result, returning 34589 1727204102.33300: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-a9c6-cddc-0000000000b4] 34589 1727204102.33302: sending task result for task 028d2410-947f-a9c6-cddc-0000000000b4 ok: [managed-node1] 34589 1727204102.33588: no more pending results, returning what we have 34589 1727204102.33590: results queue empty 34589 1727204102.33591: checking for any_errors_fatal 34589 1727204102.33592: done checking for any_errors_fatal 34589 1727204102.33593: checking for max_fail_percentage 34589 1727204102.33595: done checking for max_fail_percentage 34589 1727204102.33595: checking to see if all hosts have failed and the running result is not ok 34589 
1727204102.33596: done checking to see if all hosts have failed 34589 1727204102.33597: getting the remaining hosts for this loop 34589 1727204102.33598: done getting the remaining hosts for this loop 34589 1727204102.33602: getting the next task for host managed-node1 34589 1727204102.33612: done getting next task for host managed-node1 34589 1727204102.33615: ^ task is: TASK: Check if system is ostree 34589 1727204102.33617: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204102.33621: getting variables 34589 1727204102.33623: in VariableManager get_vars() 34589 1727204102.33655: Calling all_inventory to load vars for managed-node1 34589 1727204102.33659: Calling groups_inventory to load vars for managed-node1 34589 1727204102.33662: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204102.34016: Calling all_plugins_play to load vars for managed-node1 34589 1727204102.34022: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204102.34028: Calling groups_plugins_play to load vars for managed-node1 34589 1727204102.34623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204102.35061: done with get_vars() 34589 1727204102.35073: done getting variables 34589 1727204102.35182: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000b4 34589 1727204102.35186: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.812) 0:00:02.488 ***** 34589 1727204102.35368: entering _queue_task() for managed-node1/stat 34589 1727204102.36224: worker is 1 (out of 1 available) 34589 1727204102.36238: exiting _queue_task() for managed-node1/stat 34589 1727204102.36249: done queuing things up, now waiting for results queue to drain 34589 1727204102.36250: waiting for pending results... 
34589 1727204102.36797: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 34589 1727204102.37396: in run() - task 028d2410-947f-a9c6-cddc-0000000000b6 34589 1727204102.37401: variable 'ansible_search_path' from source: unknown 34589 1727204102.37404: variable 'ansible_search_path' from source: unknown 34589 1727204102.37434: calling self._execute() 34589 1727204102.37760: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204102.37765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204102.37767: variable 'omit' from source: magic vars 34589 1727204102.38964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204102.39489: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204102.39558: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204102.39719: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204102.39896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204102.40141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204102.40172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204102.40224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204102.40424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204102.40603: Evaluated conditional (not __network_is_ostree is defined): True 34589 1727204102.40618: variable 'omit' from source: magic vars 34589 1727204102.40878: variable 'omit' from source: magic vars 34589 1727204102.40881: variable 'omit' from source: magic vars 34589 1727204102.40900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204102.40937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204102.40991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204102.41013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204102.41026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204102.41056: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204102.41064: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204102.41074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204102.41184: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204102.41211: Set connection var ansible_shell_executable to /bin/sh 34589 
1727204102.41227: Set connection var ansible_timeout to 10 34589 1727204102.41236: Set connection var ansible_shell_type to sh 34589 1727204102.41248: Set connection var ansible_connection to ssh 34589 1727204102.41263: Set connection var ansible_pipelining to False 34589 1727204102.41318: variable 'ansible_shell_executable' from source: unknown 34589 1727204102.41325: variable 'ansible_connection' from source: unknown 34589 1727204102.41332: variable 'ansible_module_compression' from source: unknown 34589 1727204102.41338: variable 'ansible_shell_type' from source: unknown 34589 1727204102.41343: variable 'ansible_shell_executable' from source: unknown 34589 1727204102.41349: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204102.41356: variable 'ansible_pipelining' from source: unknown 34589 1727204102.41362: variable 'ansible_timeout' from source: unknown 34589 1727204102.41369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204102.41552: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204102.41635: variable 'omit' from source: magic vars 34589 1727204102.41639: starting attempt loop 34589 1727204102.41641: running the handler 34589 1727204102.41644: _low_level_execute_command(): starting 34589 1727204102.41647: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204102.42417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204102.42442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204102.42522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.42577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204102.42600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204102.42632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.43000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204102.44733: stdout chunk (state=3): >>>/root <<< 34589 1727204102.44771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204102.44781: stdout chunk (state=3): >>><<< 34589 1727204102.44788: stderr chunk (state=3): >>><<< 34589 1727204102.45003: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204102.45023: _low_level_execute_command(): starting 34589 1727204102.45034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055 `" && echo ansible-tmp-1727204102.4500241-35226-125181253773055="` echo /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055 `" ) && sleep 0' 34589 1727204102.46454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204102.46458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204102.46461: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204102.46464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.46555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204102.46560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.46656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204102.48771: stdout chunk (state=3): >>>ansible-tmp-1727204102.4500241-35226-125181253773055=/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055 <<< 34589 1727204102.48990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204102.48994: stdout chunk (state=3): >>><<< 34589 1727204102.48997: stderr chunk (state=3): >>><<< 34589 1727204102.49240: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204102.4500241-35226-125181253773055=/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204102.49244: variable 'ansible_module_compression' from source: unknown 34589 1727204102.49246: ANSIBALLZ: Using lock for stat 34589 1727204102.49248: ANSIBALLZ: Acquiring lock 34589 1727204102.49250: ANSIBALLZ: Lock acquired: 140222016673392 34589 1727204102.49252: ANSIBALLZ: Creating module 34589 1727204102.72867: ANSIBALLZ: Writing module into payload 34589 1727204102.73110: ANSIBALLZ: Writing module 34589 1727204102.73387: ANSIBALLZ: Renaming module 34589 1727204102.73392: ANSIBALLZ: Done creating module 34589 1727204102.73395: variable 'ansible_facts' from source: unknown 34589 1727204102.73455: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py 34589 1727204102.73877: Sending initial data 34589 1727204102.73882: Sent initial data (153 bytes) 34589 1727204102.75344: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204102.75350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204102.75489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204102.75665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.75809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 
1727204102.77564: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204102.77638: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204102.77736: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp1y8ex2oi /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py <<< 34589 1727204102.77760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py" <<< 34589 1727204102.77875: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp1y8ex2oi" to remote "/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py" <<< 34589 1727204102.79753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204102.79759: stderr chunk (state=3): >>><<< 34589 1727204102.79762: stdout chunk (state=3): >>><<< 34589 1727204102.79764: done transferring module to remote 34589 1727204102.79847: _low_level_execute_command(): starting 34589 1727204102.79851: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/ /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py && sleep 0' 34589 1727204102.81039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204102.81071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204102.81135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204102.81199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204102.81203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.81283: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204102.81297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.81385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204102.83470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204102.83474: stdout chunk (state=3): >>><<< 34589 1727204102.83479: stderr chunk (state=3): >>><<< 34589 1727204102.83580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204102.83592: _low_level_execute_command(): starting 34589 1727204102.83595: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/AnsiballZ_stat.py && sleep 0' 34589 1727204102.84348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.84439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204102.84456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204102.84557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204102.84589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204102.84798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204102.87118: stdout chunk 
(state=3): >>>import _frozen_importlib # frozen <<< 34589 1727204102.87123: stdout chunk (state=3): >>>import _imp # builtin <<< 34589 1727204102.87125: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34589 1727204102.87184: stdout chunk (state=3): >>>import '_io' # <<< 34589 1727204102.87194: stdout chunk (state=3): >>>import 'marshal' # <<< 34589 1727204102.87216: stdout chunk (state=3): >>>import 'posix' # <<< 34589 1727204102.87250: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34589 1727204102.87301: stdout chunk (state=3): >>>import 'time' # <<< 34589 1727204102.87305: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 34589 1727204102.87351: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.87394: stdout chunk (state=3): >>>import '_codecs' # <<< 34589 1727204102.87399: stdout chunk (state=3): >>>import 'codecs' # <<< 34589 1727204102.87425: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34589 1727204102.87490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34589 1727204102.87519: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf4184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf3e7b30> <<< 34589 1727204102.87533: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf41aa50> import '_signal' # <<< 34589 1727204102.87572: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 34589 1727204102.87589: stdout chunk (state=3): >>>import 'io' # <<< 34589 1727204102.87622: stdout chunk (state=3): >>>import '_stat' # <<< 34589 1727204102.87631: stdout chunk (state=3): >>>import 'stat' # <<< 34589 1727204102.87708: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34589 1727204102.87738: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34589 1727204102.87771: stdout chunk (state=3): >>>import 'os' # <<< 34589 1727204102.87817: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 34589 1727204102.87833: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 34589 1727204102.87867: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34589 1727204102.87884: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34589 1727204102.87910: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf209130> <<< 34589 1727204102.87976: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.87990: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf20a060> <<< 34589 1727204102.88020: stdout chunk (state=3): >>>import 'site' # <<< 34589 1727204102.88101: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34589 1727204102.88269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34589 1727204102.88297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34589 1727204102.88330: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34589 1727204102.88333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.88356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34589 1727204102.88386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34589 1727204102.88401: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34589 1727204102.88432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34589 1727204102.88445: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf247f20> <<< 34589 1727204102.88512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34589 1727204102.88528: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25c0b0> <<< 34589 1727204102.88554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34589 1727204102.88566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34589 1727204102.88592: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34589 1727204102.88648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.88693: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf27f950> <<< 34589 1727204102.88716: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 34589 1727204102.88737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34589 1727204102.88741: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf27ffe0> <<< 34589 1727204102.88749: stdout chunk (state=3): >>>import '_collections' # <<< 34589 1727204102.88819: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25fbf0> <<< 34589 1727204102.88823: stdout chunk (state=3): >>>import '_functools' # <<< 34589 1727204102.88851: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25d310> <<< 34589 1727204102.88947: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2450d0> <<< 34589 1727204102.88978: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34589 1727204102.88998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34589 1727204102.89014: stdout chunk (state=3): >>>import '_sre' # <<< 34589 1727204102.89033: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34589 1727204102.89078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34589 1727204102.89082: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 34589 1727204102.89095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34589 1727204102.89124: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a3890> <<< 34589 1727204102.89141: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a24b0> <<< 34589 1727204102.89172: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 34589 1727204102.89186: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25e1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a0ce0> <<< 34589 1727204102.89255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 34589 1727204102.89259: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d08f0> <<< 34589 1727204102.89281: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf244350> <<< 34589 1727204102.89288: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34589 1727204102.89340: stdout chunk 
(state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2d0da0> <<< 34589 1727204102.89358: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d0c50> <<< 34589 1727204102.89403: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2d1010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf242e70> <<< 34589 1727204102.89433: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.89450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34589 1727204102.89511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d16a0> <<< 34589 1727204102.89517: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d13a0> import 'importlib.machinery' # <<< 34589 1727204102.89581: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34589 1727204102.89585: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d25a0> <<< 34589 1727204102.89587: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 34589 1727204102.89617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34589 1727204102.89651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34589 1727204102.89691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 34589 1727204102.89694: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ec7d0> <<< 34589 1727204102.89707: stdout chunk (state=3): >>>import 'errno' # <<< 34589 1727204102.89759: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2edf10> <<< 34589 1727204102.89779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34589 1727204102.89804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 34589 1727204102.89816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34589 1727204102.89824: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2eedb0> <<< 34589 1727204102.89862: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2ef410> <<< 34589 1727204102.89875: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ee300> <<< 34589 1727204102.89910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34589 1727204102.89917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34589 1727204102.90014: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.90018: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2efe90> <<< 34589 1727204102.90020: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ef5c0> <<< 34589 1727204102.90050: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d2600> <<< 34589 1727204102.90056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34589 1727204102.90142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34589 1727204102.90146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34589 1727204102.90148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34589 1727204102.90183: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf067da0> <<< 34589 1727204102.90189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34589 1727204102.90242: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf090830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090590> <<< 34589 1727204102.90247: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.90293: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf0907a0> <<< 34589 1727204102.90315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34589 1727204102.90393: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.90490: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf091160> <<< 34589 1727204102.90649: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf091b50> <<< 34589 1727204102.90701: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf065f40> <<< 34589 1727204102.90716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34589 1727204102.90732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34589 1727204102.90808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34589 1727204102.90840: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf092f60> <<< 34589 1727204102.90843: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf091ca0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d2cf0> <<< 34589 1727204102.90890: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34589 1727204102.90919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.90935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34589 1727204102.90984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34589 1727204102.91029: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0bb230> <<< 34589 1727204102.91061: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34589 1727204102.91134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34589 1727204102.91162: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0e3590> <<< 34589 1727204102.91181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34589 1727204102.91229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34589 1727204102.91295: stdout chunk (state=3): >>>import 'ntpath' # <<< 34589 1727204102.91338: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf140350> <<< 34589 1727204102.91355: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34589 1727204102.91372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34589 1727204102.91391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34589 1727204102.91440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34589 1727204102.91525: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf142ab0> <<< 34589 1727204102.91651: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf140470> <<< 34589 1727204102.91655: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf109370> <<< 34589 1727204102.91690: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef41430> <<< 34589 1727204102.91715: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0e2390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf093e00> <<< 34589 1727204102.91826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34589 1727204102.91844: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8aaf0e2990> <<< 34589 1727204102.92036: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_1o4c0iuo/ansible_stat_payload.zip' # zipimport: zlib 
available <<< 34589 1727204102.92196: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34589 1727204102.92230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34589 1727204102.92270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34589 1727204102.92472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34589 1727204102.92477: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef971a0> import '_typing' # <<< 34589 1727204102.92617: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef76090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef75220> <<< 34589 1727204102.92652: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.92717: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 34589 1727204102.92729: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.94190: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.95430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34589 1727204102.95436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef95070> <<< 34589 1727204102.95470: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.95477: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34589 1727204102.95483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34589 1727204102.95507: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 34589 1727204102.95513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34589 1727204102.95541: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.95546: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbeab0> <<< 34589 1727204102.95595: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe840> <<< 34589 1727204102.95624: stdout chunk (state=3): 
>>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe150> <<< 34589 1727204102.95652: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34589 1727204102.95655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34589 1727204102.95691: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090440> <<< 34589 1727204102.95696: stdout chunk (state=3): >>>import 'atexit' # <<< 34589 1727204102.95736: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbf800> <<< 34589 1727204102.95772: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbfa40> <<< 34589 1727204102.95783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34589 1727204102.95851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34589 1727204102.95856: stdout chunk (state=3): >>>import '_locale' # <<< 34589 1727204102.95908: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbff80> <<< 34589 1727204102.95911: stdout chunk (state=3): >>>import 'pwd' # <<< 34589 1727204102.95944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34589 1727204102.95967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34589 1727204102.96011: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae909bb0> <<< 34589 1727204102.96036: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.96058: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae90b860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34589 1727204102.96082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34589 1727204102.96123: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90c230> <<< 34589 1727204102.96143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34589 1727204102.96177: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34589 1727204102.96217: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90d0d0> <<< 34589 1727204102.96220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34589 1727204102.96277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34589 1727204102.96290: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34589 1727204102.96336: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90fe60> <<< 34589 1727204102.96411: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf092f30> <<< 34589 1727204102.96414: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90e120> <<< 34589 1727204102.96452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34589 1727204102.96459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34589 1727204102.96481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34589 1727204102.96501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34589 1727204102.96544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34589 1727204102.96580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91bd40> <<< 34589 1727204102.96590: stdout chunk (state=3): >>>import '_tokenize' # <<< 34589 1727204102.96664: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91a810> <<< 34589 1727204102.96667: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91a570> <<< 34589 1727204102.96689: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34589 1727204102.96696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34589 1727204102.96775: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91aae0> <<< 34589 1727204102.96812: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90e630> 
<<< 34589 1727204102.96836: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae95ffe0> <<< 34589 1727204102.96877: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.96902: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae960170> <<< 34589 1727204102.96924: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34589 1727204102.96947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34589 1727204102.96968: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97008: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae961be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9619a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34589 1727204102.97142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34589 1727204102.97208: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97212: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9640e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae962240> <<< 34589 1727204102.97241: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34589 1727204102.97282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.97298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34589 1727204102.97324: stdout chunk (state=3): >>>import '_string' # <<< 34589 1727204102.97357: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9677d0> <<< 34589 1727204102.97504: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8aae9641d0> <<< 34589 1727204102.97567: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae968500> <<< 34589 1727204102.97590: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97599: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9689b0> <<< 34589 1727204102.97633: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97637: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae968aa0> <<< 34589 1727204102.97647: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae960320> <<< 34589 1727204102.97674: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34589 1727204102.97710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34589 1727204102.97724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34589 1727204102.97751: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97778: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f41d0><<< 34589 1727204102.97782: stdout chunk (state=3): >>> <<< 34589 1727204102.97942: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97948: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f5370> <<< 34589 1727204102.97955: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a960> <<< 34589 1727204102.97981: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.97986: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae96bd10> <<< 34589 1727204102.98020: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a5a0> # zipimport: zlib available <<< 34589 1727204102.98023: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98033: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 34589 1727204102.98044: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98139: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98234: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98239: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34589 1727204102.98264: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98277: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98287: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 34589 1727204102.98292: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98418: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.98534: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.99124: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204102.99701: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34589 1727204102.99706: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34589 1727204102.99728: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34589 1727204102.99752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204102.99813: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34589 1727204102.99818: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f9670> <<< 34589 1727204102.99895: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34589 1727204102.99902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34589 1727204102.99920: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9fa360> <<< 34589 1727204102.99926: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9f5820> <<< 34589 1727204102.99970: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34589 1727204103.00019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.00032: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34589 1727204103.00197: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34589 1727204103.00367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34589 1727204103.00383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9f8f20> # zipimport: zlib available <<< 34589 1727204103.00881: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01354: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01432: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01509: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34589 1727204103.01519: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01596: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34589 1727204103.01670: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01752: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34589 1727204103.01780: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01786: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01789: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 34589 1727204103.01804: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.01878: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34589 1727204103.01893: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02134: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34589 1727204103.02451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34589 1727204103.02464: stdout chunk (state=3): >>>import '_ast' # <<< 34589 1727204103.02540: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9fb560> <<< 34589 1727204103.02547: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02622: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02703: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34589 1727204103.02713: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34589 1727204103.02724: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34589 1727204103.02737: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02783: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02822: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34589 1727204103.02830: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02877: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02919: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.02978: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34589 1727204103.03048: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34589 1727204103.03101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204103.03192: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae806060> <<< 34589 1727204103.03229: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae801790> <<< 34589 1727204103.03263: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34589 1727204103.03267: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34589 1727204103.03342: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.03402: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.03435: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.03479: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34589 1727204103.03488: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34589 1727204103.03500: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34589 1727204103.03524: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34589 1727204103.03546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34589 1727204103.03621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34589 1727204103.03633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34589 1727204103.03655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34589 1727204103.03716: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaeff6930> <<< 34589 1727204103.03760: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf00e600> <<< 34589 1727204103.03844: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae8060f0> <<< 34589 1727204103.03853: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a7b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34589 1727204103.03856: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.03891: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34589 1727204103.03920: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34589 1727204103.03925: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34589 1727204103.03980: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34589 1727204103.04007: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.04018: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 34589 1727204103.04025: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.04177: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.04389: stdout chunk (state=3): >>># zipimport: zlib available <<< 34589 1727204103.04531: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 34589 1727204103.04931: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 34589 1727204103.04937: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type <<< 34589 1727204103.04964: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 34589 1727204103.05000: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct <<< 34589 1727204103.05028: stdout chunk (state=3): >>># cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing 
importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 34589 1727204103.05038: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token <<< 34589 1727204103.05066: stdout chunk (state=3): >>># cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon <<< 34589 1727204103.05102: stdout chunk (state=3): >>># cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # 
destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast <<< 34589 1727204103.05105: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 34589 1727204103.05112: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34589 1727204103.05388: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34589 1727204103.05398: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34589 1727204103.05425: stdout chunk (state=3): >>># destroy _bz2 <<< 34589 1727204103.05433: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34589 1727204103.05450: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 34589 1727204103.05474: stdout chunk (state=3): >>># destroy ntpath <<< 34589 1727204103.05504: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy 
__main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 34589 1727204103.05521: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 34589 1727204103.05528: stdout chunk (state=3): >>># destroy _locale # destroy pwd <<< 34589 1727204103.05548: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34589 1727204103.05577: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 34589 1727204103.05589: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 34589 1727204103.05613: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 34589 1727204103.05616: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 34589 1727204103.05671: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 34589 1727204103.05700: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 34589 1727204103.05704: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 34589 1727204103.05706: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34589 1727204103.05736: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random <<< 34589 1727204103.05751: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 34589 1727204103.05778: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 34589 1727204103.05793: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 34589 1727204103.05804: stdout chunk (state=3): >>># destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 34589 1727204103.05828: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 34589 1727204103.05846: stdout chunk (state=3): >>># cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 34589 1727204103.05868: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 34589 1727204103.05875: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34589 1727204103.06066: stdout chunk (state=3): >>># destroy sys.monitoring <<< 34589 1727204103.06074: stdout chunk (state=3): >>># destroy _socket <<< 34589 1727204103.06081: stdout chunk (state=3): >>># destroy _collections <<< 34589 1727204103.06111: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 34589 1727204103.06119: stdout chunk (state=3): >>># destroy tokenize <<< 34589 1727204103.06137: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34589 1727204103.06174: stdout chunk (state=3): >>># destroy _typing <<< 34589 1727204103.06179: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 34589 1727204103.06189: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves <<< 34589 1727204103.06209: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34589 1727204103.06227: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34589 1727204103.06314: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 34589 1727204103.06328: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 34589 1727204103.06331: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34589 1727204103.06358: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 34589 1727204103.06377: stdout chunk (state=3): >>># destroy _hashlib <<< 34589 1727204103.06397: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 34589 1727204103.06406: stdout chunk (state=3): >>># destroy itertools <<< 34589 1727204103.06425: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 34589 1727204103.06436: stdout chunk (state=3): >>># destroy _thread <<< 34589 1727204103.06441: stdout chunk (state=3): >>># clear sys.audit hooks <<< 34589 1727204103.06790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
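The module run ends with the JSON result visible in the stdout stream above: {"changed": false, "stat": {"exists": false}} for the path /run/ostree-booted (commonly used as a marker file on ostree/rpm-ostree booted systems), after which the payload tears down its interpreter state and the SSH master reports exit status 0. For orientation, a minimal Python sketch of what this particular invocation amounts to follows; it is an illustration only, assuming just the fields visible in that result, while the real ansible.builtin.stat module gathers far more (checksums, MIME type, file attributes):

    import json
    import os

    # Sketch only: reproduce the argument set and result shape seen in the
    # log above for this one invocation. The actual module payload is the
    # zipimported AnsiballZ archive, not this script.
    module_args = {
        "path": "/run/ostree-booted",
        "follow": False,
        "get_checksum": True,
        "get_mime": True,
        "get_attributes": True,
        "checksum_algorithm": "sha1",
    }

    result = {
        "changed": False,
        "stat": {"exists": os.path.exists(module_args["path"])},
        "invocation": {"module_args": module_args},
    }

    # The module prints a single JSON document on stdout, which the
    # controller parses out of the stream shown above.
    print(json.dumps(result))

In this run the result shows "exists": false, indicating the marker file is absent on the managed node, so the host is treated as a non-ostree system.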
<<< 34589 1727204103.06819: stderr chunk (state=3): >>><<< 34589 1727204103.06822: stdout chunk (state=3): >>><<< 34589 1727204103.06887: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf4184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf3e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf41aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf209130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf20a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf247f20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25c0b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf27f950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf27ffe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25fbf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25d310> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2450d0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a3890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a24b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf25e1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2a0ce0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d08f0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf244350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2d0da0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d0c50> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2d1010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf242e70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d16a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d13a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d25a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ec7d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2edf10> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8aaf2eedb0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2ef410> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ee300> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf2efe90> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2ef5c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d2600> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf067da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf090830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf0907a0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf091160> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf091b50> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf065f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf092f60> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf091ca0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf2d2cf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0bb230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0e3590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf140350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf142ab0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf140470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf109370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef41430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf0e2390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf093e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f8aaf0e2990> # zipimport: found 30 names in '/tmp/ansible_stat_payload_1o4c0iuo/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef971a0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef76090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef75220> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaef95070> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbeab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbe5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf090440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbf800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaefbfa40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaefbff80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae909bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae90b860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90d0d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90fe60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aaf092f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91bd40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91a810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91a570> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae91aae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae90e630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae95ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae960170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae961be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9619a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9640e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae962240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9677d0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9641d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae968500> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9689b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae968aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae960320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f41d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f5370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae96bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a5a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae9f9670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9fa360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9f5820> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9f8f20> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae9fb560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aae806060> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae801790> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaeff6930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aaf00e600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae8060f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aae96a7b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
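The stderr block above is only OpenSSH client debug output from the multiplexed connection (the ControlMaster socket at /root/.ansible/cp/a0f5415566); the module's real result is the single JSON line embedded in stdout, {"changed": false, "stat": {"exists": false}, ...}, whose module_args show a stat call against /run/ostree-booted. Judging from those arguments and the task name reported further down ("Check if system is ostree"), the task being executed is presumably equivalent to the minimal sketch below; the actual el_repo_setup.yml contents are not part of this log, and the register name is inferred from the __ostree_booted_stat variable referenced later in the trace:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat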
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34589 1727204103.07406: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204103.07410: _low_level_execute_command(): starting 34589 1727204103.07412: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204102.4500241-35226-125181253773055/ > /dev/null 2>&1 && sleep 0' 34589 1727204103.07618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204103.07622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204103.07625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204103.07627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204103.07630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204103.07632: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204103.07653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.07655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204103.07658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204103.07663: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204103.07673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204103.07727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204103.07730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204103.07732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204103.07734: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204103.07736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.07778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204103.07795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204103.07807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.07915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204103.09890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204103.09931: stderr chunk (state=3): >>><<< 34589 1727204103.09934: stdout chunk (state=3): >>><<< 34589 1727204103.09950: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204103.09956: handler run complete 34589 1727204103.09971: attempt loop complete, returning result 34589 1727204103.09973: _execute() done 34589 1727204103.09978: dumping result to json 34589 1727204103.09981: done dumping result, returning 34589 1727204103.09989: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [028d2410-947f-a9c6-cddc-0000000000b6] 34589 1727204103.09992: sending task result for task 028d2410-947f-a9c6-cddc-0000000000b6 34589 1727204103.10081: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000b6 34589 1727204103.10084: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 34589 1727204103.10146: no more pending results, returning what we have 34589 1727204103.10148: results queue empty 34589 1727204103.10149: checking for any_errors_fatal 34589 1727204103.10154: done checking for any_errors_fatal 34589 1727204103.10155: checking for max_fail_percentage 34589 1727204103.10157: done checking for max_fail_percentage 34589 1727204103.10157: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.10158: done checking to see if all hosts have failed 34589 1727204103.10159: getting the remaining hosts for this loop 34589 1727204103.10160: done getting the remaining hosts for this loop 34589 1727204103.10163: getting the next task for host managed-node1 34589 1727204103.10169: done getting next task for host managed-node1 34589 1727204103.10171: ^ task is: TASK: Set flag to indicate system is ostree 34589 1727204103.10174: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.10179: getting variables 34589 1727204103.10181: in VariableManager get_vars() 34589 1727204103.10215: Calling all_inventory to load vars for managed-node1 34589 1727204103.10218: Calling groups_inventory to load vars for managed-node1 34589 1727204103.10221: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.10232: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.10234: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.10237: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.10489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.10679: done with get_vars() 34589 1727204103.10690: done getting variables 34589 1727204103.10799: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.754) 0:00:03.243 ***** 34589 1727204103.10840: entering _queue_task() for managed-node1/set_fact 34589 1727204103.10842: Creating lock for set_fact 34589 1727204103.11144: worker is 1 (out of 1 available) 34589 1727204103.11267: exiting _queue_task() for managed-node1/set_fact 34589 1727204103.11280: done queuing things up, now waiting for results queue to drain 34589 1727204103.11281: waiting for pending results... 
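The trace that follows shows how this set_fact task (el_repo_setup.yml:22) resolves: the conditional not __network_is_ostree is defined evaluates True, the registered __ostree_booted_stat result is looked up, and the task ends with the fact __network_is_ostree set to false because /run/ostree-booted does not exist. Assuming the task simply mirrors the stat result, a minimal sketch would be (the real file contents are not shown in this log):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined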
34589 1727204103.11490: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 34589 1727204103.11584: in run() - task 028d2410-947f-a9c6-cddc-0000000000b7 34589 1727204103.11588: variable 'ansible_search_path' from source: unknown 34589 1727204103.11591: variable 'ansible_search_path' from source: unknown 34589 1727204103.11617: calling self._execute() 34589 1727204103.11701: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.11714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.11781: variable 'omit' from source: magic vars 34589 1727204103.12313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204103.12574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204103.12628: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204103.12673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204103.12714: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204103.12812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204103.12874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204103.12882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204103.12909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204103.13043: Evaluated conditional (not __network_is_ostree is defined): True 34589 1727204103.13055: variable 'omit' from source: magic vars 34589 1727204103.13107: variable 'omit' from source: magic vars 34589 1727204103.13315: variable '__ostree_booted_stat' from source: set_fact 34589 1727204103.13318: variable 'omit' from source: magic vars 34589 1727204103.13327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204103.13359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204103.13382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204103.13402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.13422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.13456: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204103.13464: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.13471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.13582: Set connection var ansible_module_compression to ZIP_DEFLATED 
34589 1727204103.13592: Set connection var ansible_shell_executable to /bin/sh 34589 1727204103.13603: Set connection var ansible_timeout to 10 34589 1727204103.13612: Set connection var ansible_shell_type to sh 34589 1727204103.13638: Set connection var ansible_connection to ssh 34589 1727204103.13641: Set connection var ansible_pipelining to False 34589 1727204103.13665: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.13780: variable 'ansible_connection' from source: unknown 34589 1727204103.13783: variable 'ansible_module_compression' from source: unknown 34589 1727204103.13785: variable 'ansible_shell_type' from source: unknown 34589 1727204103.13787: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.13789: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.13791: variable 'ansible_pipelining' from source: unknown 34589 1727204103.13792: variable 'ansible_timeout' from source: unknown 34589 1727204103.13794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.13823: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204103.13838: variable 'omit' from source: magic vars 34589 1727204103.13847: starting attempt loop 34589 1727204103.13853: running the handler 34589 1727204103.13867: handler run complete 34589 1727204103.13881: attempt loop complete, returning result 34589 1727204103.13887: _execute() done 34589 1727204103.13893: dumping result to json 34589 1727204103.13912: done dumping result, returning 34589 1727204103.14022: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [028d2410-947f-a9c6-cddc-0000000000b7] 34589 1727204103.14025: sending task result for task 028d2410-947f-a9c6-cddc-0000000000b7 34589 1727204103.14091: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000b7 34589 1727204103.14095: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34589 1727204103.14180: no more pending results, returning what we have 34589 1727204103.14183: results queue empty 34589 1727204103.14184: checking for any_errors_fatal 34589 1727204103.14190: done checking for any_errors_fatal 34589 1727204103.14190: checking for max_fail_percentage 34589 1727204103.14192: done checking for max_fail_percentage 34589 1727204103.14193: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.14194: done checking to see if all hosts have failed 34589 1727204103.14195: getting the remaining hosts for this loop 34589 1727204103.14196: done getting the remaining hosts for this loop 34589 1727204103.14199: getting the next task for host managed-node1 34589 1727204103.14211: done getting next task for host managed-node1 34589 1727204103.14214: ^ task is: TASK: Fix CentOS6 Base repo 34589 1727204103.14217: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.14221: getting variables 34589 1727204103.14223: in VariableManager get_vars() 34589 1727204103.14399: Calling all_inventory to load vars for managed-node1 34589 1727204103.14402: Calling groups_inventory to load vars for managed-node1 34589 1727204103.14405: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.14417: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.14420: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.14429: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.14718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.14925: done with get_vars() 34589 1727204103.14935: done getting variables 34589 1727204103.15057: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.042) 0:00:03.285 ***** 34589 1727204103.15086: entering _queue_task() for managed-node1/copy 34589 1727204103.15480: worker is 1 (out of 1 available) 34589 1727204103.15490: exiting _queue_task() for managed-node1/copy 34589 1727204103.15500: done queuing things up, now waiting for results queue to drain 34589 1727204103.15501: waiting for pending results... 
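The task queued next is the copy action for "Fix CentOS6 Base repo" (el_repo_setup.yml:26). As the trace below records, it is skipped on this host: ansible_distribution == 'CentOS' evaluates True, but ansible_distribution_major_version == '6' evaluates False. A hedged sketch of a task with that guard is given here; only the two conditionals and the copy action are visible in the log, so the destination path and file contents are assumptions:

    - name: Fix CentOS6 Base repo
      copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo  # hypothetical path, not shown in the log
        content: |
          # replacement CentOS 6 repo definitions would go here
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'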
34589 1727204103.15652: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 34589 1727204103.15783: in run() - task 028d2410-947f-a9c6-cddc-0000000000b9 34589 1727204103.15787: variable 'ansible_search_path' from source: unknown 34589 1727204103.15789: variable 'ansible_search_path' from source: unknown 34589 1727204103.15888: calling self._execute() 34589 1727204103.15916: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.15926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.15940: variable 'omit' from source: magic vars 34589 1727204103.16424: variable 'ansible_distribution' from source: facts 34589 1727204103.16458: Evaluated conditional (ansible_distribution == 'CentOS'): True 34589 1727204103.16589: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.16599: Evaluated conditional (ansible_distribution_major_version == '6'): False 34589 1727204103.16608: when evaluation is False, skipping this task 34589 1727204103.16652: _execute() done 34589 1727204103.16655: dumping result to json 34589 1727204103.16657: done dumping result, returning 34589 1727204103.16660: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [028d2410-947f-a9c6-cddc-0000000000b9] 34589 1727204103.16662: sending task result for task 028d2410-947f-a9c6-cddc-0000000000b9 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34589 1727204103.16823: no more pending results, returning what we have 34589 1727204103.16826: results queue empty 34589 1727204103.16827: checking for any_errors_fatal 34589 1727204103.16832: done checking for any_errors_fatal 34589 1727204103.16833: checking for max_fail_percentage 34589 1727204103.16835: done checking for max_fail_percentage 34589 1727204103.16836: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.16837: done checking to see if all hosts have failed 34589 1727204103.16838: getting the remaining hosts for this loop 34589 1727204103.16839: done getting the remaining hosts for this loop 34589 1727204103.16843: getting the next task for host managed-node1 34589 1727204103.16850: done getting next task for host managed-node1 34589 1727204103.16852: ^ task is: TASK: Include the task 'enable_epel.yml' 34589 1727204103.16855: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.16860: getting variables 34589 1727204103.16861: in VariableManager get_vars() 34589 1727204103.16893: Calling all_inventory to load vars for managed-node1 34589 1727204103.16896: Calling groups_inventory to load vars for managed-node1 34589 1727204103.16900: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.16915: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.16918: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.16922: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.17379: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000b9 34589 1727204103.17382: WORKER PROCESS EXITING 34589 1727204103.17408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.17599: done with get_vars() 34589 1727204103.17611: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.026) 0:00:03.311 ***** 34589 1727204103.17703: entering _queue_task() for managed-node1/include_tasks 34589 1727204103.18021: worker is 1 (out of 1 available) 34589 1727204103.18034: exiting _queue_task() for managed-node1/include_tasks 34589 1727204103.18046: done queuing things up, now waiting for results queue to drain 34589 1727204103.18048: waiting for pending results... 34589 1727204103.18331: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 34589 1727204103.18442: in run() - task 028d2410-947f-a9c6-cddc-0000000000ba 34589 1727204103.18461: variable 'ansible_search_path' from source: unknown 34589 1727204103.18469: variable 'ansible_search_path' from source: unknown 34589 1727204103.18520: calling self._execute() 34589 1727204103.18613: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.18626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.18643: variable 'omit' from source: magic vars 34589 1727204103.19172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204103.21635: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204103.21722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204103.21764: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204103.21814: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204103.21843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204103.21938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204103.21972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204103.22013: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204103.22180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204103.22184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204103.22202: variable '__network_is_ostree' from source: set_fact 34589 1727204103.22226: Evaluated conditional (not __network_is_ostree | d(false)): True 34589 1727204103.22236: _execute() done 34589 1727204103.22243: dumping result to json 34589 1727204103.22249: done dumping result, returning 34589 1727204103.22259: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [028d2410-947f-a9c6-cddc-0000000000ba] 34589 1727204103.22267: sending task result for task 028d2410-947f-a9c6-cddc-0000000000ba 34589 1727204103.22437: no more pending results, returning what we have 34589 1727204103.22442: in VariableManager get_vars() 34589 1727204103.22481: Calling all_inventory to load vars for managed-node1 34589 1727204103.22484: Calling groups_inventory to load vars for managed-node1 34589 1727204103.22489: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.22501: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.22504: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.22511: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.22908: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000ba 34589 1727204103.22912: WORKER PROCESS EXITING 34589 1727204103.22935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.23222: done with get_vars() 34589 1727204103.23230: variable 'ansible_search_path' from source: unknown 34589 1727204103.23231: variable 'ansible_search_path' from source: unknown 34589 1727204103.23273: we have included files to process 34589 1727204103.23276: generating all_blocks data 34589 1727204103.23278: done generating all_blocks data 34589 1727204103.23284: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34589 1727204103.23285: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34589 1727204103.23289: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34589 1727204103.24041: done processing included file 34589 1727204103.24043: iterating over new_blocks loaded from include file 34589 1727204103.24044: in VariableManager get_vars() 34589 1727204103.24057: done with get_vars() 34589 1727204103.24059: filtering new block on tags 34589 1727204103.24082: done filtering new block on tags 34589 1727204103.24085: in VariableManager get_vars() 34589 1727204103.24095: done with get_vars() 34589 1727204103.24096: filtering new block on tags 34589 1727204103.24110: done filtering new block on tags 34589 1727204103.24112: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 34589 1727204103.24117: extending task lists for all hosts with included blocks 34589 1727204103.24226: done extending task lists 34589 1727204103.24227: done processing included files 34589 1727204103.24228: results queue empty 34589 1727204103.24229: checking for any_errors_fatal 34589 1727204103.24232: done checking for any_errors_fatal 34589 1727204103.24233: checking for max_fail_percentage 34589 1727204103.24234: done checking for max_fail_percentage 34589 1727204103.24235: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.24235: done checking to see if all hosts have failed 34589 1727204103.24236: getting the remaining hosts for this loop 34589 1727204103.24237: done getting the remaining hosts for this loop 34589 1727204103.24239: getting the next task for host managed-node1 34589 1727204103.24248: done getting next task for host managed-node1 34589 1727204103.24250: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34589 1727204103.24253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.24255: getting variables 34589 1727204103.24256: in VariableManager get_vars() 34589 1727204103.24264: Calling all_inventory to load vars for managed-node1 34589 1727204103.24267: Calling groups_inventory to load vars for managed-node1 34589 1727204103.24269: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.24274: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.24284: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.24287: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.24492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.24750: done with get_vars() 34589 1727204103.24759: done getting variables 34589 1727204103.24848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204103.25092: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.074) 0:00:03.386 ***** 34589 1727204103.25145: entering _queue_task() for managed-node1/command 34589 1727204103.25147: Creating lock for command 34589 1727204103.25723: worker is 1 (out of 1 available) 34589 1727204103.25733: exiting _queue_task() for managed-node1/command 34589 1727204103.25743: done queuing things up, now waiting for results queue to drain 34589 1727204103.25744: waiting for pending results... 
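The include processed above (el_repo_setup.yml:51 pulling tasks/enable_epel.yml into the task list for managed-node1) reduces to a single guarded include_tasks. A sketch, assuming the include path is written relative to the tests/network/tasks directory; the task name and the ostree condition are taken directly from the trace:

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml        # relative path spelling is an assumption
      when: not __network_is_ostree | d(false)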
34589 1727204103.25928: running TaskExecutor() for managed-node1/TASK: Create EPEL 10 34589 1727204103.26039: in run() - task 028d2410-947f-a9c6-cddc-0000000000d4 34589 1727204103.26057: variable 'ansible_search_path' from source: unknown 34589 1727204103.26064: variable 'ansible_search_path' from source: unknown 34589 1727204103.26115: calling self._execute() 34589 1727204103.26195: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.26213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.26229: variable 'omit' from source: magic vars 34589 1727204103.26760: variable 'ansible_distribution' from source: facts 34589 1727204103.26764: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34589 1727204103.26858: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.26878: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34589 1727204103.26885: when evaluation is False, skipping this task 34589 1727204103.26891: _execute() done 34589 1727204103.26898: dumping result to json 34589 1727204103.26983: done dumping result, returning 34589 1727204103.26987: done running TaskExecutor() for managed-node1/TASK: Create EPEL 10 [028d2410-947f-a9c6-cddc-0000000000d4] 34589 1727204103.26989: sending task result for task 028d2410-947f-a9c6-cddc-0000000000d4 34589 1727204103.27066: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000d4 34589 1727204103.27069: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34589 1727204103.27144: no more pending results, returning what we have 34589 1727204103.27147: results queue empty 34589 1727204103.27148: checking for any_errors_fatal 34589 1727204103.27150: done checking for any_errors_fatal 34589 1727204103.27150: checking for max_fail_percentage 34589 1727204103.27152: done checking for max_fail_percentage 34589 1727204103.27153: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.27153: done checking to see if all hosts have failed 34589 1727204103.27154: getting the remaining hosts for this loop 34589 1727204103.27155: done getting the remaining hosts for this loop 34589 1727204103.27159: getting the next task for host managed-node1 34589 1727204103.27166: done getting next task for host managed-node1 34589 1727204103.27169: ^ task is: TASK: Install yum-utils package 34589 1727204103.27172: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.27179: getting variables 34589 1727204103.27181: in VariableManager get_vars() 34589 1727204103.27216: Calling all_inventory to load vars for managed-node1 34589 1727204103.27219: Calling groups_inventory to load vars for managed-node1 34589 1727204103.27223: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.27238: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.27242: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.27245: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.27690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.27907: done with get_vars() 34589 1727204103.27922: done getting variables 34589 1727204103.28027: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.029) 0:00:03.415 ***** 34589 1727204103.28057: entering _queue_task() for managed-node1/package 34589 1727204103.28058: Creating lock for package 34589 1727204103.28470: worker is 1 (out of 1 available) 34589 1727204103.28483: exiting _queue_task() for managed-node1/package 34589 1727204103.28492: done queuing things up, now waiting for results queue to drain 34589 1727204103.28493: waiting for pending results... 
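The "Create EPEL 10" task above is the templated task "Create EPEL {{ ansible_distribution_major_version }}" from enable_epel.yml:8, skipped because this host reports major version 10 rather than 7 or 8. A sketch of its likely shape; the command body is not visible in the trace (the task never ran), so it appears here only as a placeholder:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: echo "placeholder; the real EPEL setup command is not recoverable from this log"
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']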
34589 1727204103.28716: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 34589 1727204103.28899: in run() - task 028d2410-947f-a9c6-cddc-0000000000d5 34589 1727204103.28921: variable 'ansible_search_path' from source: unknown 34589 1727204103.28928: variable 'ansible_search_path' from source: unknown 34589 1727204103.29083: calling self._execute() 34589 1727204103.29349: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.29358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.29421: variable 'omit' from source: magic vars 34589 1727204103.30226: variable 'ansible_distribution' from source: facts 34589 1727204103.30244: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34589 1727204103.30542: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.30595: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34589 1727204103.30604: when evaluation is False, skipping this task 34589 1727204103.30622: _execute() done 34589 1727204103.30631: dumping result to json 34589 1727204103.30640: done dumping result, returning 34589 1727204103.30841: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [028d2410-947f-a9c6-cddc-0000000000d5] 34589 1727204103.30845: sending task result for task 028d2410-947f-a9c6-cddc-0000000000d5 34589 1727204103.30924: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000d5 34589 1727204103.30927: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34589 1727204103.31060: no more pending results, returning what we have 34589 1727204103.31064: results queue empty 34589 1727204103.31064: checking for any_errors_fatal 34589 1727204103.31068: done checking for any_errors_fatal 34589 1727204103.31069: checking for max_fail_percentage 34589 1727204103.31071: done checking for max_fail_percentage 34589 1727204103.31072: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.31072: done checking to see if all hosts have failed 34589 1727204103.31073: getting the remaining hosts for this loop 34589 1727204103.31074: done getting the remaining hosts for this loop 34589 1727204103.31080: getting the next task for host managed-node1 34589 1727204103.31087: done getting next task for host managed-node1 34589 1727204103.31090: ^ task is: TASK: Enable EPEL 7 34589 1727204103.31094: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.31099: getting variables 34589 1727204103.31101: in VariableManager get_vars() 34589 1727204103.31132: Calling all_inventory to load vars for managed-node1 34589 1727204103.31135: Calling groups_inventory to load vars for managed-node1 34589 1727204103.31139: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.31153: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.31156: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.31159: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.31730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.32216: done with get_vars() 34589 1727204103.32229: done getting variables 34589 1727204103.32404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.043) 0:00:03.459 ***** 34589 1727204103.32438: entering _queue_task() for managed-node1/command 34589 1727204103.32877: worker is 1 (out of 1 available) 34589 1727204103.32890: exiting _queue_task() for managed-node1/command 34589 1727204103.32912: done queuing things up, now waiting for results queue to drain 34589 1727204103.32914: waiting for pending results... 34589 1727204103.33287: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 34589 1727204103.33293: in run() - task 028d2410-947f-a9c6-cddc-0000000000d6 34589 1727204103.33316: variable 'ansible_search_path' from source: unknown 34589 1727204103.33324: variable 'ansible_search_path' from source: unknown 34589 1727204103.33371: calling self._execute() 34589 1727204103.33455: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.33472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.33491: variable 'omit' from source: magic vars 34589 1727204103.34124: variable 'ansible_distribution' from source: facts 34589 1727204103.34130: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34589 1727204103.34214: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.34225: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34589 1727204103.34274: when evaluation is False, skipping this task 34589 1727204103.34341: _execute() done 34589 1727204103.34345: dumping result to json 34589 1727204103.34449: done dumping result, returning 34589 1727204103.34453: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [028d2410-947f-a9c6-cddc-0000000000d6] 34589 1727204103.34456: sending task result for task 028d2410-947f-a9c6-cddc-0000000000d6 34589 1727204103.34540: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000d6 34589 1727204103.34544: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34589 1727204103.34618: no more pending results, returning what we 
have 34589 1727204103.34622: results queue empty 34589 1727204103.34623: checking for any_errors_fatal 34589 1727204103.34634: done checking for any_errors_fatal 34589 1727204103.34635: checking for max_fail_percentage 34589 1727204103.34637: done checking for max_fail_percentage 34589 1727204103.34637: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.34638: done checking to see if all hosts have failed 34589 1727204103.34639: getting the remaining hosts for this loop 34589 1727204103.34640: done getting the remaining hosts for this loop 34589 1727204103.34646: getting the next task for host managed-node1 34589 1727204103.34658: done getting next task for host managed-node1 34589 1727204103.34662: ^ task is: TASK: Enable EPEL 8 34589 1727204103.34666: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.34670: getting variables 34589 1727204103.34672: in VariableManager get_vars() 34589 1727204103.34714: Calling all_inventory to load vars for managed-node1 34589 1727204103.34717: Calling groups_inventory to load vars for managed-node1 34589 1727204103.34722: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.35220: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.35224: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.35228: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.35786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.36242: done with get_vars() 34589 1727204103.36254: done getting variables 34589 1727204103.36431: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.040) 0:00:03.499 ***** 34589 1727204103.36461: entering _queue_task() for managed-node1/command 34589 1727204103.36949: worker is 1 (out of 1 available) 34589 1727204103.36958: exiting _queue_task() for managed-node1/command 34589 1727204103.36969: done queuing things up, now waiting for results queue to drain 34589 1727204103.36970: waiting for pending results... 
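The "Install yum-utils package" and "Enable EPEL 7" tasks above (enable_epel.yml:26 and :32) are skipped by the same version guard. A sketch of the package task, assuming state: present; the module, the package name, and both conditions are taken from the trace:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present          # assumed; only the module and the package name are evident in the log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']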
34589 1727204103.37293: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 34589 1727204103.37298: in run() - task 028d2410-947f-a9c6-cddc-0000000000d7 34589 1727204103.37366: variable 'ansible_search_path' from source: unknown 34589 1727204103.37369: variable 'ansible_search_path' from source: unknown 34589 1727204103.37371: calling self._execute() 34589 1727204103.37435: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.37445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.37457: variable 'omit' from source: magic vars 34589 1727204103.37849: variable 'ansible_distribution' from source: facts 34589 1727204103.37864: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34589 1727204103.37997: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.38014: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34589 1727204103.38083: when evaluation is False, skipping this task 34589 1727204103.38086: _execute() done 34589 1727204103.38089: dumping result to json 34589 1727204103.38091: done dumping result, returning 34589 1727204103.38094: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [028d2410-947f-a9c6-cddc-0000000000d7] 34589 1727204103.38096: sending task result for task 028d2410-947f-a9c6-cddc-0000000000d7 34589 1727204103.38168: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000d7 34589 1727204103.38171: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34589 1727204103.38236: no more pending results, returning what we have 34589 1727204103.38240: results queue empty 34589 1727204103.38241: checking for any_errors_fatal 34589 1727204103.38247: done checking for any_errors_fatal 34589 1727204103.38248: checking for max_fail_percentage 34589 1727204103.38250: done checking for max_fail_percentage 34589 1727204103.38251: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.38251: done checking to see if all hosts have failed 34589 1727204103.38252: getting the remaining hosts for this loop 34589 1727204103.38253: done getting the remaining hosts for this loop 34589 1727204103.38257: getting the next task for host managed-node1 34589 1727204103.38266: done getting next task for host managed-node1 34589 1727204103.38269: ^ task is: TASK: Enable EPEL 6 34589 1727204103.38273: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.38281: getting variables 34589 1727204103.38283: in VariableManager get_vars() 34589 1727204103.38318: Calling all_inventory to load vars for managed-node1 34589 1727204103.38321: Calling groups_inventory to load vars for managed-node1 34589 1727204103.38325: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.38339: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.38342: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.38346: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.38669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.39050: done with get_vars() 34589 1727204103.39136: done getting variables 34589 1727204103.39313: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.028) 0:00:03.528 ***** 34589 1727204103.39343: entering _queue_task() for managed-node1/copy 34589 1727204103.39967: worker is 1 (out of 1 available) 34589 1727204103.40079: exiting _queue_task() for managed-node1/copy 34589 1727204103.40091: done queuing things up, now waiting for results queue to drain 34589 1727204103.40093: waiting for pending results... 34589 1727204103.40671: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 34589 1727204103.40800: in run() - task 028d2410-947f-a9c6-cddc-0000000000d9 34589 1727204103.40948: variable 'ansible_search_path' from source: unknown 34589 1727204103.40970: variable 'ansible_search_path' from source: unknown 34589 1727204103.41015: calling self._execute() 34589 1727204103.41191: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.41195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.41198: variable 'omit' from source: magic vars 34589 1727204103.41580: variable 'ansible_distribution' from source: facts 34589 1727204103.41599: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34589 1727204103.41733: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.41745: Evaluated conditional (ansible_distribution_major_version == '6'): False 34589 1727204103.41756: when evaluation is False, skipping this task 34589 1727204103.41765: _execute() done 34589 1727204103.41774: dumping result to json 34589 1727204103.41783: done dumping result, returning 34589 1727204103.41843: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [028d2410-947f-a9c6-cddc-0000000000d9] 34589 1727204103.41846: sending task result for task 028d2410-947f-a9c6-cddc-0000000000d9 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34589 1727204103.42222: no more pending results, returning what we have 34589 1727204103.42226: results queue empty 34589 1727204103.42226: checking for any_errors_fatal 34589 1727204103.42234: done checking for any_errors_fatal 34589 
1727204103.42234: checking for max_fail_percentage 34589 1727204103.42236: done checking for max_fail_percentage 34589 1727204103.42237: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.42238: done checking to see if all hosts have failed 34589 1727204103.42238: getting the remaining hosts for this loop 34589 1727204103.42240: done getting the remaining hosts for this loop 34589 1727204103.42244: getting the next task for host managed-node1 34589 1727204103.42254: done getting next task for host managed-node1 34589 1727204103.42256: ^ task is: TASK: Set network provider to 'nm' 34589 1727204103.42259: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.42264: getting variables 34589 1727204103.42265: in VariableManager get_vars() 34589 1727204103.42416: Calling all_inventory to load vars for managed-node1 34589 1727204103.42419: Calling groups_inventory to load vars for managed-node1 34589 1727204103.42423: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.42436: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.42439: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.42442: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.43134: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000d9 34589 1727204103.43137: WORKER PROCESS EXITING 34589 1727204103.44244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.44768: done with get_vars() 34589 1727204103.44879: done getting variables 34589 1727204103.45000: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.057) 0:00:03.585 ***** 34589 1727204103.45092: entering _queue_task() for managed-node1/set_fact 34589 1727204103.45765: worker is 1 (out of 1 available) 34589 1727204103.46478: exiting _queue_task() for managed-node1/set_fact 34589 1727204103.46491: done queuing things up, now waiting for results queue to drain 34589 1727204103.46492: waiting for pending results... 
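With the EPEL setup tasks all skipped on this CentOS 10 node, the play moves on to the task queued above, "Set network provider to 'nm'" from tests_ipv6_disabled_nm.yml:13. Its ok result further below confirms it is a plain set_fact; a sketch, assuming no additional arguments:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm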
34589 1727204103.46739: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 34589 1727204103.47298: in run() - task 028d2410-947f-a9c6-cddc-000000000007 34589 1727204103.47303: variable 'ansible_search_path' from source: unknown 34589 1727204103.47305: calling self._execute() 34589 1727204103.47413: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.47516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.47535: variable 'omit' from source: magic vars 34589 1727204103.47777: variable 'omit' from source: magic vars 34589 1727204103.47814: variable 'omit' from source: magic vars 34589 1727204103.47862: variable 'omit' from source: magic vars 34589 1727204103.47942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204103.48105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204103.48135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204103.48191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.48235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.48313: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204103.48335: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.48436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.48783: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204103.48960: Set connection var ansible_shell_executable to /bin/sh 34589 1727204103.48963: Set connection var ansible_timeout to 10 34589 1727204103.48966: Set connection var ansible_shell_type to sh 34589 1727204103.48968: Set connection var ansible_connection to ssh 34589 1727204103.48970: Set connection var ansible_pipelining to False 34589 1727204103.49087: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.49096: variable 'ansible_connection' from source: unknown 34589 1727204103.49105: variable 'ansible_module_compression' from source: unknown 34589 1727204103.49112: variable 'ansible_shell_type' from source: unknown 34589 1727204103.49123: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.49148: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.49283: variable 'ansible_pipelining' from source: unknown 34589 1727204103.49287: variable 'ansible_timeout' from source: unknown 34589 1727204103.49289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.49520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204103.49570: variable 'omit' from source: magic vars 34589 1727204103.49618: starting attempt loop 34589 1727204103.49625: running the handler 34589 1727204103.49640: handler run complete 34589 1727204103.49655: attempt loop complete, returning result 34589 1727204103.49773: _execute() done 34589 1727204103.49777: 
dumping result to json 34589 1727204103.49780: done dumping result, returning 34589 1727204103.49782: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [028d2410-947f-a9c6-cddc-000000000007] 34589 1727204103.49785: sending task result for task 028d2410-947f-a9c6-cddc-000000000007 ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34589 1727204103.50028: no more pending results, returning what we have 34589 1727204103.50032: results queue empty 34589 1727204103.50032: checking for any_errors_fatal 34589 1727204103.50039: done checking for any_errors_fatal 34589 1727204103.50040: checking for max_fail_percentage 34589 1727204103.50042: done checking for max_fail_percentage 34589 1727204103.50043: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.50044: done checking to see if all hosts have failed 34589 1727204103.50044: getting the remaining hosts for this loop 34589 1727204103.50045: done getting the remaining hosts for this loop 34589 1727204103.50050: getting the next task for host managed-node1 34589 1727204103.50057: done getting next task for host managed-node1 34589 1727204103.50059: ^ task is: TASK: meta (flush_handlers) 34589 1727204103.50060: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.50065: getting variables 34589 1727204103.50067: in VariableManager get_vars() 34589 1727204103.50103: Calling all_inventory to load vars for managed-node1 34589 1727204103.50107: Calling groups_inventory to load vars for managed-node1 34589 1727204103.50111: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.50122: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.50125: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.50128: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.50810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.51306: done with get_vars() 34589 1727204103.51319: done getting variables 34589 1727204103.51437: done sending task result for task 028d2410-947f-a9c6-cddc-000000000007 34589 1727204103.51441: WORKER PROCESS EXITING 34589 1727204103.51611: in VariableManager get_vars() 34589 1727204103.51622: Calling all_inventory to load vars for managed-node1 34589 1727204103.51625: Calling groups_inventory to load vars for managed-node1 34589 1727204103.51627: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.51633: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.51635: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.51638: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.52134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.52679: done with get_vars() 34589 1727204103.52697: done queuing things up, now waiting for results queue to drain 34589 1727204103.52701: results queue empty 34589 1727204103.52701: checking for any_errors_fatal 34589 1727204103.52704: done checking for any_errors_fatal 34589 1727204103.52704: checking for 
max_fail_percentage 34589 1727204103.52705: done checking for max_fail_percentage 34589 1727204103.52706: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.52707: done checking to see if all hosts have failed 34589 1727204103.52708: getting the remaining hosts for this loop 34589 1727204103.52708: done getting the remaining hosts for this loop 34589 1727204103.52711: getting the next task for host managed-node1 34589 1727204103.52715: done getting next task for host managed-node1 34589 1727204103.52717: ^ task is: TASK: meta (flush_handlers) 34589 1727204103.52718: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.52727: getting variables 34589 1727204103.52728: in VariableManager get_vars() 34589 1727204103.52738: Calling all_inventory to load vars for managed-node1 34589 1727204103.52741: Calling groups_inventory to load vars for managed-node1 34589 1727204103.52743: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.52749: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.52751: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.52754: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.53124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.53733: done with get_vars() 34589 1727204103.53744: done getting variables 34589 1727204103.54009: in VariableManager get_vars() 34589 1727204103.54020: Calling all_inventory to load vars for managed-node1 34589 1727204103.54023: Calling groups_inventory to load vars for managed-node1 34589 1727204103.54025: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.54030: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.54032: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.54035: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.54429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.54883: done with get_vars() 34589 1727204103.54898: done queuing things up, now waiting for results queue to drain 34589 1727204103.54900: results queue empty 34589 1727204103.54901: checking for any_errors_fatal 34589 1727204103.54902: done checking for any_errors_fatal 34589 1727204103.54903: checking for max_fail_percentage 34589 1727204103.54904: done checking for max_fail_percentage 34589 1727204103.54905: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.54905: done checking to see if all hosts have failed 34589 1727204103.54906: getting the remaining hosts for this loop 34589 1727204103.54907: done getting the remaining hosts for this loop 34589 1727204103.54910: getting the next task for host managed-node1 34589 1727204103.54913: done getting next task for host managed-node1 34589 1727204103.54914: ^ task is: None 34589 1727204103.54916: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.54917: done queuing things up, now waiting for results queue to drain 34589 1727204103.54918: results queue empty 34589 1727204103.54919: checking for any_errors_fatal 34589 1727204103.54919: done checking for any_errors_fatal 34589 1727204103.54920: checking for max_fail_percentage 34589 1727204103.54921: done checking for max_fail_percentage 34589 1727204103.54921: checking to see if all hosts have failed and the running result is not ok 34589 1727204103.54922: done checking to see if all hosts have failed 34589 1727204103.54924: getting the next task for host managed-node1 34589 1727204103.54927: done getting next task for host managed-node1 34589 1727204103.54928: ^ task is: None 34589 1727204103.54929: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204103.55102: in VariableManager get_vars() 34589 1727204103.55126: done with get_vars() 34589 1727204103.55133: in VariableManager get_vars() 34589 1727204103.55145: done with get_vars() 34589 1727204103.55150: variable 'omit' from source: magic vars 34589 1727204103.55289: in VariableManager get_vars() 34589 1727204103.55308: done with get_vars() 34589 1727204103.55330: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 34589 1727204103.56059: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34589 1727204103.56093: getting the remaining hosts for this loop 34589 1727204103.56094: done getting the remaining hosts for this loop 34589 1727204103.56097: getting the next task for host managed-node1 34589 1727204103.56099: done getting next task for host managed-node1 34589 1727204103.56101: ^ task is: TASK: Gathering Facts 34589 1727204103.56103: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204103.56104: getting variables 34589 1727204103.56105: in VariableManager get_vars() 34589 1727204103.56116: Calling all_inventory to load vars for managed-node1 34589 1727204103.56118: Calling groups_inventory to load vars for managed-node1 34589 1727204103.56120: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204103.56125: Calling all_plugins_play to load vars for managed-node1 34589 1727204103.56138: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204103.56141: Calling groups_plugins_play to load vars for managed-node1 34589 1727204103.56398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204103.56795: done with get_vars() 34589 1727204103.56805: done getting variables 34589 1727204103.56850: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.117) 0:00:03.703 ***** 34589 1727204103.56878: entering _queue_task() for managed-node1/gather_facts 34589 1727204103.57668: worker is 1 (out of 1 available) 34589 1727204103.57680: exiting _queue_task() for managed-node1/gather_facts 34589 1727204103.57690: done queuing things up, now waiting for results queue to drain 34589 1727204103.57691: waiting for pending results... 
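The run has now entered the play "Play for testing ipv6 disabled" from playbooks/tests_ipv6_disabled.yml and queued its fact gathering. A sketch of the play header, assuming the nm-specific wrapper imports this playbook after setting network_provider; only the play name and file path come from the trace, while the hosts pattern and the gather_facts setting are assumptions:

    - name: Play for testing ipv6 disabled
      hosts: managed-node1          # assumed; the trace only shows this node being targeted
      gather_facts: true            # assumed; fact gathering appears as the play's first task below
      tasks:
        # test tasks are not visible in this portion of the trace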
34589 1727204103.57975: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34589 1727204103.58163: in run() - task 028d2410-947f-a9c6-cddc-0000000000ff 34589 1727204103.58180: variable 'ansible_search_path' from source: unknown 34589 1727204103.58366: calling self._execute() 34589 1727204103.58543: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.58546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.58549: variable 'omit' from source: magic vars 34589 1727204103.59133: variable 'ansible_distribution_major_version' from source: facts 34589 1727204103.59310: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204103.59313: variable 'omit' from source: magic vars 34589 1727204103.59316: variable 'omit' from source: magic vars 34589 1727204103.59318: variable 'omit' from source: magic vars 34589 1727204103.59582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204103.59586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204103.59611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204103.59639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.59656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204103.59849: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204103.59853: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.59855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.60012: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204103.60027: Set connection var ansible_shell_executable to /bin/sh 34589 1727204103.60079: Set connection var ansible_timeout to 10 34589 1727204103.60087: Set connection var ansible_shell_type to sh 34589 1727204103.60100: Set connection var ansible_connection to ssh 34589 1727204103.60114: Set connection var ansible_pipelining to False 34589 1727204103.60201: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.60215: variable 'ansible_connection' from source: unknown 34589 1727204103.60223: variable 'ansible_module_compression' from source: unknown 34589 1727204103.60231: variable 'ansible_shell_type' from source: unknown 34589 1727204103.60237: variable 'ansible_shell_executable' from source: unknown 34589 1727204103.60264: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204103.60282: variable 'ansible_pipelining' from source: unknown 34589 1727204103.60485: variable 'ansible_timeout' from source: unknown 34589 1727204103.60489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204103.60717: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204103.60721: variable 'omit' from source: magic vars 34589 1727204103.60725: starting attempt loop 34589 1727204103.60728: running the 
handler 34589 1727204103.60730: variable 'ansible_facts' from source: unknown 34589 1727204103.60733: _low_level_execute_command(): starting 34589 1727204103.60736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204103.61555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204103.61592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.61705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204103.63620: stdout chunk (state=3): >>>/root <<< 34589 1727204103.63715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204103.63741: stdout chunk (state=3): >>><<< 34589 1727204103.63748: stderr chunk (state=3): >>><<< 34589 1727204103.63919: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204103.63923: _low_level_execute_command(): starting 34589 1727204103.63927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309 `" && echo ansible-tmp-1727204103.6381688-35295-181615325217309="` echo /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309 `" ) && sleep 0' 34589 1727204103.64585: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204103.64600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204103.64660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204103.64670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.64735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204103.64743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.64828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204103.66935: stdout chunk (state=3): >>>ansible-tmp-1727204103.6381688-35295-181615325217309=/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309 <<< 34589 1727204103.67072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204103.67079: stdout chunk (state=3): >>><<< 34589 1727204103.67083: stderr chunk (state=3): >>><<< 34589 1727204103.67283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204103.6381688-35295-181615325217309=/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204103.67286: variable 'ansible_module_compression' from source: unknown 34589 1727204103.67288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204103.67290: variable 'ansible_facts' from source: unknown 34589 1727204103.67510: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py 34589 1727204103.67720: Sending initial data 34589 1727204103.67733: Sent initial data (154 bytes) 34589 1727204103.68203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204103.68224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204103.68236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.68321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204103.68334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.68420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204103.70167: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204103.70239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204103.70316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpw56iqq5g /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py <<< 34589 1727204103.70320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py" <<< 34589 1727204103.70396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpw56iqq5g" to remote "/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py" <<< 34589 1727204103.71900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204103.71939: stderr chunk (state=3): >>><<< 34589 1727204103.71943: stdout chunk (state=3): >>><<< 34589 1727204103.71961: done transferring module to remote 34589 1727204103.71977: _low_level_execute_command(): starting 34589 1727204103.71981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/ /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py && sleep 0' 34589 1727204103.72507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204103.72543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.72548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.72551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204103.72585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.72638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204103.74783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204103.74786: stdout chunk (state=3): >>><<< 34589 1727204103.74789: stderr chunk (state=3): >>><<< 34589 1727204103.74791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204103.74881: _low_level_execute_command(): starting 34589 1727204103.74884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/AnsiballZ_setup.py && sleep 0' 34589 1727204103.75696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204103.75724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204103.75760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204103.75836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.43996: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.7314453125, "5m": 0.54052734375, "15m": 0.28271484375}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processo<<< 34589 1727204104.44115: stdout chunk (state=3): >>>r_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2911, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 620, "free": 2911}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 695, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785595904, "block_size": 4096, "block_total": 65519099, "block_available": 63912499, "block_used": 1606600, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "04", "epoch": "1727204104", "epoch_int": "1727204104", "date": "2024-09-24", "time": "14:55:04", "iso8601_micro": "2024-09-24T18:55:04.393406Z", "iso8601": "2024-09-24T18:55:04Z", "iso8601_basic": "20240924T145504393406", "iso8601_basic_short": "20240924T145504", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], 
"ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204104.46334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.46390: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204104.46423: stderr chunk (state=3): >>><<< 34589 1727204104.46445: stdout chunk (state=3): >>><<< 34589 1727204104.46682: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.7314453125, "5m": 0.54052734375, "15m": 0.28271484375}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec277914f6c5b9c03bd977e30033112b", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2911, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 620, "free": 2911}, "nocache": {"free": 3269, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 695, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785595904, "block_size": 4096, "block_total": 65519099, "block_available": 63912499, "block_used": 1606600, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "04", "epoch": "1727204104", "epoch_int": "1727204104", "date": "2024-09-24", "time": "14:55:04", "iso8601_micro": "2024-09-24T18:55:04.393406Z", "iso8601": "2024-09-24T18:55:04Z", "iso8601_basic": "20240924T145504393406", "iso8601_basic_short": "20240924T145504", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204104.46847: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204104.46879: _low_level_execute_command(): starting 34589 1727204104.46890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204103.6381688-35295-181615325217309/ > /dev/null 2>&1 && sleep 0' 34589 1727204104.47589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204104.47605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204104.47695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.47726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204104.47751: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 34589 1727204104.47767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.47908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.49914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.50026: stderr chunk (state=3): >>><<< 34589 1727204104.50089: stdout chunk (state=3): >>><<< 34589 1727204104.50386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204104.50390: handler run complete 34589 1727204104.50392: variable 'ansible_facts' from source: unknown 34589 1727204104.50565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.51582: variable 'ansible_facts' from source: unknown 34589 1727204104.51586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.51892: attempt loop complete, returning result 34589 1727204104.51904: _execute() done 34589 1727204104.51912: dumping result to json 34589 1727204104.51947: done dumping result, returning 34589 1727204104.51960: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-a9c6-cddc-0000000000ff] 34589 1727204104.51968: sending task result for task 028d2410-947f-a9c6-cddc-0000000000ff 34589 1727204104.52782: done sending task result for task 028d2410-947f-a9c6-cddc-0000000000ff 34589 1727204104.52786: WORKER PROCESS EXITING ok: [managed-node1] 34589 1727204104.53151: no more pending results, returning what we have 34589 1727204104.53154: results queue empty 34589 1727204104.53155: checking for any_errors_fatal 34589 1727204104.53156: done checking for any_errors_fatal 34589 1727204104.53157: checking for max_fail_percentage 34589 1727204104.53158: done checking for max_fail_percentage 34589 1727204104.53159: checking to see if all hosts have failed and the running result is not ok 34589 1727204104.53160: done checking to see if all hosts have failed 34589 1727204104.53161: getting the remaining hosts for this loop 34589 1727204104.53162: done getting the remaining hosts for this loop 34589 1727204104.53165: getting the next task for host managed-node1 34589 1727204104.53170: done getting next task for host managed-node1 34589 1727204104.53172: ^ task is: 
TASK: meta (flush_handlers) 34589 1727204104.53173: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204104.53179: getting variables 34589 1727204104.53181: in VariableManager get_vars() 34589 1727204104.53311: Calling all_inventory to load vars for managed-node1 34589 1727204104.53314: Calling groups_inventory to load vars for managed-node1 34589 1727204104.53316: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.53327: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.53329: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.53332: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.53721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.54234: done with get_vars() 34589 1727204104.54245: done getting variables 34589 1727204104.54325: in VariableManager get_vars() 34589 1727204104.54338: Calling all_inventory to load vars for managed-node1 34589 1727204104.54341: Calling groups_inventory to load vars for managed-node1 34589 1727204104.54343: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.54348: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.54350: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.54353: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.54708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.55071: done with get_vars() 34589 1727204104.55087: done queuing things up, now waiting for results queue to drain 34589 1727204104.55089: results queue empty 34589 1727204104.55090: checking for any_errors_fatal 34589 1727204104.55094: done checking for any_errors_fatal 34589 1727204104.55095: checking for max_fail_percentage 34589 1727204104.55096: done checking for max_fail_percentage 34589 1727204104.55102: checking to see if all hosts have failed and the running result is not ok 34589 1727204104.55103: done checking to see if all hosts have failed 34589 1727204104.55103: getting the remaining hosts for this loop 34589 1727204104.55104: done getting the remaining hosts for this loop 34589 1727204104.55107: getting the next task for host managed-node1 34589 1727204104.55111: done getting next task for host managed-node1 34589 1727204104.55113: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 34589 1727204104.55115: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204104.55117: getting variables 34589 1727204104.55118: in VariableManager get_vars() 34589 1727204104.55130: Calling all_inventory to load vars for managed-node1 34589 1727204104.55132: Calling groups_inventory to load vars for managed-node1 34589 1727204104.55134: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.55139: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.55141: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.55259: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.55507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.55931: done with get_vars() 34589 1727204104.55940: done getting variables 34589 1727204104.55984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204104.56297: variable 'type' from source: play vars 34589 1727204104.56304: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.994) 0:00:04.699 ***** 34589 1727204104.56456: entering _queue_task() for managed-node1/set_fact 34589 1727204104.57221: worker is 1 (out of 1 available) 34589 1727204104.57232: exiting _queue_task() for managed-node1/set_fact 34589 1727204104.57242: done queuing things up, now waiting for results queue to drain 34589 1727204104.57243: waiting for pending results... 
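
The playbook source itself is not captured in this log; only the task header above (task path tests_ipv6_disabled.yml:10) and the facts it ends up setting (interface=ethtest0 and type=veth, visible in the "ok:" task result further down) can be read from the output. Below is a minimal hedged sketch of a play that would produce an equivalent set_fact step, assuming the play vars carry the literal values seen in the log; the real file is not shown here and almost certainly differs in detail.

    ---
    # Hypothetical sketch only; not the actual contents of tests_ipv6_disabled.yml
    - hosts: managed-node1
      gather_facts: true          # matches the "Gathering Facts" task seen earlier in the log
      vars:
        type: veth
        interface: ethtest0
      tasks:
        - name: "Set type={{ type }} and interface={{ interface }}"
          ansible.builtin.set_fact:
            type: "{{ type }}"            # resolves to "veth" in the ok: result below
            interface: "{{ interface }}"  # resolves to "ethtest0"
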
34589 1727204104.57709: running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 34589 1727204104.57780: in run() - task 028d2410-947f-a9c6-cddc-00000000000b 34589 1727204104.57816: variable 'ansible_search_path' from source: unknown 34589 1727204104.57864: calling self._execute() 34589 1727204104.57974: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.58030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.58033: variable 'omit' from source: magic vars 34589 1727204104.58467: variable 'ansible_distribution_major_version' from source: facts 34589 1727204104.58492: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204104.58505: variable 'omit' from source: magic vars 34589 1727204104.58533: variable 'omit' from source: magic vars 34589 1727204104.58577: variable 'type' from source: play vars 34589 1727204104.58663: variable 'type' from source: play vars 34589 1727204104.58687: variable 'interface' from source: play vars 34589 1727204104.58800: variable 'interface' from source: play vars 34589 1727204104.58810: variable 'omit' from source: magic vars 34589 1727204104.58859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204104.58912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204104.58942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204104.58967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204104.58987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204104.59031: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204104.59040: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.59090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.59208: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204104.59221: Set connection var ansible_shell_executable to /bin/sh 34589 1727204104.59249: Set connection var ansible_timeout to 10 34589 1727204104.59281: Set connection var ansible_shell_type to sh 34589 1727204104.59284: Set connection var ansible_connection to ssh 34589 1727204104.59290: Set connection var ansible_pipelining to False 34589 1727204104.59305: variable 'ansible_shell_executable' from source: unknown 34589 1727204104.59380: variable 'ansible_connection' from source: unknown 34589 1727204104.59383: variable 'ansible_module_compression' from source: unknown 34589 1727204104.59386: variable 'ansible_shell_type' from source: unknown 34589 1727204104.59388: variable 'ansible_shell_executable' from source: unknown 34589 1727204104.59391: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.59393: variable 'ansible_pipelining' from source: unknown 34589 1727204104.59395: variable 'ansible_timeout' from source: unknown 34589 1727204104.59397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.59688: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204104.59749: variable 'omit' from source: magic vars 34589 1727204104.59765: starting attempt loop 34589 1727204104.59780: running the handler 34589 1727204104.59807: handler run complete 34589 1727204104.59823: attempt loop complete, returning result 34589 1727204104.59830: _execute() done 34589 1727204104.59842: dumping result to json 34589 1727204104.59850: done dumping result, returning 34589 1727204104.59862: done running TaskExecutor() for managed-node1/TASK: Set type=veth and interface=ethtest0 [028d2410-947f-a9c6-cddc-00000000000b] 34589 1727204104.59873: sending task result for task 028d2410-947f-a9c6-cddc-00000000000b ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 34589 1727204104.60058: no more pending results, returning what we have 34589 1727204104.60061: results queue empty 34589 1727204104.60062: checking for any_errors_fatal 34589 1727204104.60065: done checking for any_errors_fatal 34589 1727204104.60066: checking for max_fail_percentage 34589 1727204104.60068: done checking for max_fail_percentage 34589 1727204104.60069: checking to see if all hosts have failed and the running result is not ok 34589 1727204104.60070: done checking to see if all hosts have failed 34589 1727204104.60070: getting the remaining hosts for this loop 34589 1727204104.60071: done getting the remaining hosts for this loop 34589 1727204104.60078: getting the next task for host managed-node1 34589 1727204104.60084: done getting next task for host managed-node1 34589 1727204104.60087: ^ task is: TASK: Include the task 'show_interfaces.yml' 34589 1727204104.60090: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204104.60093: getting variables 34589 1727204104.60094: in VariableManager get_vars() 34589 1727204104.60133: Calling all_inventory to load vars for managed-node1 34589 1727204104.60140: Calling groups_inventory to load vars for managed-node1 34589 1727204104.60144: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.60155: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.60157: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.60160: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.60682: done sending task result for task 028d2410-947f-a9c6-cddc-00000000000b 34589 1727204104.60685: WORKER PROCESS EXITING 34589 1727204104.60708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.60981: done with get_vars() 34589 1727204104.60993: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.046) 0:00:04.746 ***** 34589 1727204104.61119: entering _queue_task() for managed-node1/include_tasks 34589 1727204104.61623: worker is 1 (out of 1 available) 34589 1727204104.61634: exiting _queue_task() for managed-node1/include_tasks 34589 1727204104.61645: done queuing things up, now waiting for results queue to drain 34589 1727204104.61646: waiting for pending results... 34589 1727204104.61840: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 34589 1727204104.61944: in run() - task 028d2410-947f-a9c6-cddc-00000000000c 34589 1727204104.61968: variable 'ansible_search_path' from source: unknown 34589 1727204104.62017: calling self._execute() 34589 1727204104.62161: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.62164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.62171: variable 'omit' from source: magic vars 34589 1727204104.62942: variable 'ansible_distribution_major_version' from source: facts 34589 1727204104.63024: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204104.63118: _execute() done 34589 1727204104.63122: dumping result to json 34589 1727204104.63124: done dumping result, returning 34589 1727204104.63127: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-a9c6-cddc-00000000000c] 34589 1727204104.63129: sending task result for task 028d2410-947f-a9c6-cddc-00000000000c 34589 1727204104.63206: done sending task result for task 028d2410-947f-a9c6-cddc-00000000000c 34589 1727204104.63209: WORKER PROCESS EXITING 34589 1727204104.63256: no more pending results, returning what we have 34589 1727204104.63261: in VariableManager get_vars() 34589 1727204104.63308: Calling all_inventory to load vars for managed-node1 34589 1727204104.63311: Calling groups_inventory to load vars for managed-node1 34589 1727204104.63313: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.63328: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.63331: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.63334: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.63888: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.64630: done with get_vars() 34589 1727204104.64639: variable 'ansible_search_path' from source: unknown 34589 1727204104.64656: we have included files to process 34589 1727204104.64657: generating all_blocks data 34589 1727204104.64658: done generating all_blocks data 34589 1727204104.64659: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204104.64660: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204104.64783: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204104.65055: in VariableManager get_vars() 34589 1727204104.65180: done with get_vars() 34589 1727204104.65293: done processing included file 34589 1727204104.65296: iterating over new_blocks loaded from include file 34589 1727204104.65297: in VariableManager get_vars() 34589 1727204104.65346: done with get_vars() 34589 1727204104.65348: filtering new block on tags 34589 1727204104.65367: done filtering new block on tags 34589 1727204104.65369: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 34589 1727204104.65377: extending task lists for all hosts with included blocks 34589 1727204104.66445: done extending task lists 34589 1727204104.66447: done processing included files 34589 1727204104.66447: results queue empty 34589 1727204104.66448: checking for any_errors_fatal 34589 1727204104.66451: done checking for any_errors_fatal 34589 1727204104.66452: checking for max_fail_percentage 34589 1727204104.66453: done checking for max_fail_percentage 34589 1727204104.66454: checking to see if all hosts have failed and the running result is not ok 34589 1727204104.66455: done checking to see if all hosts have failed 34589 1727204104.66455: getting the remaining hosts for this loop 34589 1727204104.66457: done getting the remaining hosts for this loop 34589 1727204104.66459: getting the next task for host managed-node1 34589 1727204104.66463: done getting next task for host managed-node1 34589 1727204104.66465: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34589 1727204104.66467: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204104.66471: getting variables 34589 1727204104.66472: in VariableManager get_vars() 34589 1727204104.66491: Calling all_inventory to load vars for managed-node1 34589 1727204104.66494: Calling groups_inventory to load vars for managed-node1 34589 1727204104.66496: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.66503: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.66508: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.66511: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.66704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.66891: done with get_vars() 34589 1727204104.66901: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.058) 0:00:04.804 ***** 34589 1727204104.66982: entering _queue_task() for managed-node1/include_tasks 34589 1727204104.67682: worker is 1 (out of 1 available) 34589 1727204104.67694: exiting _queue_task() for managed-node1/include_tasks 34589 1727204104.67711: done queuing things up, now waiting for results queue to drain 34589 1727204104.67712: waiting for pending results... 34589 1727204104.68216: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 34589 1727204104.68294: in run() - task 028d2410-947f-a9c6-cddc-000000000115 34589 1727204104.68299: variable 'ansible_search_path' from source: unknown 34589 1727204104.68302: variable 'ansible_search_path' from source: unknown 34589 1727204104.68305: calling self._execute() 34589 1727204104.68366: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.68382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.68402: variable 'omit' from source: magic vars 34589 1727204104.68821: variable 'ansible_distribution_major_version' from source: facts 34589 1727204104.68855: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204104.68858: _execute() done 34589 1727204104.68946: dumping result to json 34589 1727204104.68949: done dumping result, returning 34589 1727204104.68952: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-a9c6-cddc-000000000115] 34589 1727204104.68954: sending task result for task 028d2410-947f-a9c6-cddc-000000000115 34589 1727204104.69035: done sending task result for task 028d2410-947f-a9c6-cddc-000000000115 34589 1727204104.69038: WORKER PROCESS EXITING 34589 1727204104.69071: no more pending results, returning what we have 34589 1727204104.69078: in VariableManager get_vars() 34589 1727204104.69125: Calling all_inventory to load vars for managed-node1 34589 1727204104.69128: Calling groups_inventory to load vars for managed-node1 34589 1727204104.69131: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.69147: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.69150: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.69154: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.69681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 34589 1727204104.69954: done with get_vars() 34589 1727204104.69962: variable 'ansible_search_path' from source: unknown 34589 1727204104.69963: variable 'ansible_search_path' from source: unknown 34589 1727204104.70021: we have included files to process 34589 1727204104.70022: generating all_blocks data 34589 1727204104.70024: done generating all_blocks data 34589 1727204104.70025: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204104.70026: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204104.70028: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204104.70461: done processing included file 34589 1727204104.70463: iterating over new_blocks loaded from include file 34589 1727204104.70465: in VariableManager get_vars() 34589 1727204104.70488: done with get_vars() 34589 1727204104.70490: filtering new block on tags 34589 1727204104.70509: done filtering new block on tags 34589 1727204104.70511: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 34589 1727204104.70516: extending task lists for all hosts with included blocks 34589 1727204104.70657: done extending task lists 34589 1727204104.70658: done processing included files 34589 1727204104.70659: results queue empty 34589 1727204104.70660: checking for any_errors_fatal 34589 1727204104.70663: done checking for any_errors_fatal 34589 1727204104.70663: checking for max_fail_percentage 34589 1727204104.70664: done checking for max_fail_percentage 34589 1727204104.70665: checking to see if all hosts have failed and the running result is not ok 34589 1727204104.70666: done checking to see if all hosts have failed 34589 1727204104.70667: getting the remaining hosts for this loop 34589 1727204104.70668: done getting the remaining hosts for this loop 34589 1727204104.70671: getting the next task for host managed-node1 34589 1727204104.70675: done getting next task for host managed-node1 34589 1727204104.70678: ^ task is: TASK: Gather current interface info 34589 1727204104.70681: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204104.70684: getting variables 34589 1727204104.70685: in VariableManager get_vars() 34589 1727204104.70703: Calling all_inventory to load vars for managed-node1 34589 1727204104.70706: Calling groups_inventory to load vars for managed-node1 34589 1727204104.70708: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204104.70713: Calling all_plugins_play to load vars for managed-node1 34589 1727204104.70716: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204104.70718: Calling groups_plugins_play to load vars for managed-node1 34589 1727204104.70862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204104.71054: done with get_vars() 34589 1727204104.71063: done getting variables 34589 1727204104.71107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.041) 0:00:04.846 ***** 34589 1727204104.71141: entering _queue_task() for managed-node1/command 34589 1727204104.71677: worker is 1 (out of 1 available) 34589 1727204104.71688: exiting _queue_task() for managed-node1/command 34589 1727204104.71702: done queuing things up, now waiting for results queue to drain 34589 1727204104.71704: waiting for pending results... 
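The 'Gather current interface info' task queued above lists the entries under /sys/class/net on the managed node. Judging from the module arguments and the registered variable name that appear further down in this log (chdir=/sys/class/net, 'ls -1', a result later read as '_current_interfaces'), the task in get_current_interfaces.yml is roughly equivalent to the sketch below; the register name and exact layout are inferred from the log, not quoted from the file.

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net   # one entry per network interface on the host
  register: _current_interfaces
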
34589 1727204104.71982: running TaskExecutor() for managed-node1/TASK: Gather current interface info 34589 1727204104.72123: in run() - task 028d2410-947f-a9c6-cddc-000000000192 34589 1727204104.72127: variable 'ansible_search_path' from source: unknown 34589 1727204104.72131: variable 'ansible_search_path' from source: unknown 34589 1727204104.72158: calling self._execute() 34589 1727204104.72230: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.72234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.72237: variable 'omit' from source: magic vars 34589 1727204104.72724: variable 'ansible_distribution_major_version' from source: facts 34589 1727204104.72779: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204104.72782: variable 'omit' from source: magic vars 34589 1727204104.72792: variable 'omit' from source: magic vars 34589 1727204104.72821: variable 'omit' from source: magic vars 34589 1727204104.72853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204104.72884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204104.72900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204104.72916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204104.72924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204104.72947: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204104.72950: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.72953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.73027: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204104.73030: Set connection var ansible_shell_executable to /bin/sh 34589 1727204104.73038: Set connection var ansible_timeout to 10 34589 1727204104.73040: Set connection var ansible_shell_type to sh 34589 1727204104.73046: Set connection var ansible_connection to ssh 34589 1727204104.73051: Set connection var ansible_pipelining to False 34589 1727204104.73068: variable 'ansible_shell_executable' from source: unknown 34589 1727204104.73070: variable 'ansible_connection' from source: unknown 34589 1727204104.73073: variable 'ansible_module_compression' from source: unknown 34589 1727204104.73077: variable 'ansible_shell_type' from source: unknown 34589 1727204104.73080: variable 'ansible_shell_executable' from source: unknown 34589 1727204104.73082: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204104.73091: variable 'ansible_pipelining' from source: unknown 34589 1727204104.73095: variable 'ansible_timeout' from source: unknown 34589 1727204104.73097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204104.73240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204104.73249: variable 'omit' from source: magic vars 34589 
1727204104.73254: starting attempt loop 34589 1727204104.73257: running the handler 34589 1727204104.73316: _low_level_execute_command(): starting 34589 1727204104.73322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204104.74117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204104.74138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204104.74160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204104.74190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204104.74216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204104.74230: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204104.74293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.74363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204104.74392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204104.74423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.74596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.76358: stdout chunk (state=3): >>>/root <<< 34589 1727204104.76452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.76489: stderr chunk (state=3): >>><<< 34589 1727204104.76492: stdout chunk (state=3): >>><<< 34589 1727204104.76516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204104.76528: _low_level_execute_command(): 
starting 34589 1727204104.76535: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321 `" && echo ansible-tmp-1727204104.7651591-35357-203802828670321="` echo /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321 `" ) && sleep 0' 34589 1727204104.77347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.77402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204104.77421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204104.77449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.77573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.79718: stdout chunk (state=3): >>>ansible-tmp-1727204104.7651591-35357-203802828670321=/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321 <<< 34589 1727204104.79864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.79867: stderr chunk (state=3): >>><<< 34589 1727204104.79870: stdout chunk (state=3): >>><<< 34589 1727204104.79881: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204104.7651591-35357-203802828670321=/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 
1727204104.79914: variable 'ansible_module_compression' from source: unknown 34589 1727204104.79958: ANSIBALLZ: Using generic lock for ansible.legacy.command 34589 1727204104.79961: ANSIBALLZ: Acquiring lock 34589 1727204104.79964: ANSIBALLZ: Lock acquired: 140222054199088 34589 1727204104.79966: ANSIBALLZ: Creating module 34589 1727204104.90886: ANSIBALLZ: Writing module into payload 34589 1727204104.90952: ANSIBALLZ: Writing module 34589 1727204104.90993: ANSIBALLZ: Renaming module 34589 1727204104.90996: ANSIBALLZ: Done creating module 34589 1727204104.91015: variable 'ansible_facts' from source: unknown 34589 1727204104.91064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py 34589 1727204104.91215: Sending initial data 34589 1727204104.91218: Sent initial data (156 bytes) 34589 1727204104.91744: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204104.91748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.91751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204104.91753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204104.91755: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.91860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.91924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.93717: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204104.93802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204104.93879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmppmiytxxw /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py <<< 34589 1727204104.93882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py" <<< 34589 1727204104.93951: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 34589 1727204104.93954: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmppmiytxxw" to remote "/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py" <<< 34589 1727204104.94669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.94719: stderr chunk (state=3): >>><<< 34589 1727204104.94723: stdout chunk (state=3): >>><<< 34589 1727204104.94768: done transferring module to remote 34589 1727204104.94779: _low_level_execute_command(): starting 34589 1727204104.94784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/ /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py && sleep 0' 34589 1727204104.95420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204104.95424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.95427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204104.95442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204104.95496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.95565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204104.97537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204104.97566: stderr chunk (state=3): >>><<< 34589 1727204104.97569: stdout chunk (state=3): >>><<< 34589 1727204104.97586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204104.97589: _low_level_execute_command(): starting 34589 1727204104.97595: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/AnsiballZ_command.py && sleep 0' 34589 1727204104.98079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204104.98179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204104.98183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204104.98186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204104.98188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204104.98202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204104.98305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.15586: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:05.149185", "end": "2024-09-24 14:55:05.152730", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204105.17271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.17335: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204105.17344: stdout chunk (state=3): >>><<< 34589 1727204105.17352: stderr chunk (state=3): >>><<< 34589 1727204105.17370: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:05.149185", "end": "2024-09-24 14:55:05.152730", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204105.17410: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204105.17424: _low_level_execute_command(): starting 34589 1727204105.17434: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204104.7651591-35357-203802828670321/ > /dev/null 2>&1 && sleep 0' 34589 1727204105.18026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.18042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.18058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.18078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204105.18096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204105.18109: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204105.18124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204105.18144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204105.18196: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204105.18250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204105.18269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.18295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.18408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.20486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.20490: stdout chunk (state=3): >>><<< 34589 1727204105.20493: stderr chunk (state=3): >>><<< 34589 1727204105.20683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204105.20687: handler run complete 34589 1727204105.20690: Evaluated conditional (False): False 34589 1727204105.20692: attempt loop complete, returning result 34589 1727204105.20694: _execute() done 34589 1727204105.20696: dumping result to json 34589 1727204105.20698: done dumping result, returning 34589 1727204105.20701: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-a9c6-cddc-000000000192] 34589 1727204105.20702: sending task result for task 028d2410-947f-a9c6-cddc-000000000192 34589 1727204105.20774: done sending task result for task 028d2410-947f-a9c6-cddc-000000000192 34589 1727204105.20779: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003545", "end": "2024-09-24 14:55:05.152730", "rc": 0, "start": "2024-09-24 14:55:05.149185" } STDOUT: bonding_masters eth0 lo 34589 1727204105.20856: no more pending results, returning what we have 34589 1727204105.20859: results queue empty 34589 1727204105.20860: checking for any_errors_fatal 34589 1727204105.20861: done checking for any_errors_fatal 34589 1727204105.20862: checking for max_fail_percentage 34589 1727204105.20864: done checking for max_fail_percentage 34589 1727204105.20864: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.20865: done checking to see if all hosts have failed 34589 1727204105.20865: getting the remaining hosts for this loop 34589 1727204105.20867: done getting the remaining hosts for this loop 34589 1727204105.20870: getting the next task for host managed-node1 34589 1727204105.20877: done getting next task for host managed-node1 34589 1727204105.20880: ^ task is: TASK: Set current_interfaces 34589 1727204105.20883: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204105.20886: getting variables 34589 1727204105.20942: in VariableManager get_vars() 34589 1727204105.20972: Calling all_inventory to load vars for managed-node1 34589 1727204105.21093: Calling groups_inventory to load vars for managed-node1 34589 1727204105.21099: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.21112: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.21116: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.21119: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.21552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.21960: done with get_vars() 34589 1727204105.21990: done getting variables 34589 1727204105.22207: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.510) 0:00:05.357 ***** 34589 1727204105.22243: entering _queue_task() for managed-node1/set_fact 34589 1727204105.23030: worker is 1 (out of 1 available) 34589 1727204105.23042: exiting _queue_task() for managed-node1/set_fact 34589 1727204105.23055: done queuing things up, now waiting for results queue to drain 34589 1727204105.23056: waiting for pending results... 
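The 'Set current_interfaces' task queued above turns the registered command output into a plain list of interface names. Given that the resulting fact is ['bonding_masters', 'eth0', 'lo'] and the only source variable referenced is '_current_interfaces', a plausible sketch is the following; the stdout_lines expression is an assumption, not a quote from the file.

- name: Set current_interfaces
  ansible.builtin.set_fact:
    # stdout of 'ls -1' split into a list, one name per line
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
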
34589 1727204105.23794: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 34589 1727204105.23800: in run() - task 028d2410-947f-a9c6-cddc-000000000193 34589 1727204105.23803: variable 'ansible_search_path' from source: unknown 34589 1727204105.23809: variable 'ansible_search_path' from source: unknown 34589 1727204105.23812: calling self._execute() 34589 1727204105.24069: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.24091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.24096: variable 'omit' from source: magic vars 34589 1727204105.24502: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.24506: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.24508: variable 'omit' from source: magic vars 34589 1727204105.24546: variable 'omit' from source: magic vars 34589 1727204105.24672: variable '_current_interfaces' from source: set_fact 34589 1727204105.24748: variable 'omit' from source: magic vars 34589 1727204105.24804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204105.24852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204105.24935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204105.24938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.24942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.24959: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204105.24967: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.24980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.25100: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204105.25112: Set connection var ansible_shell_executable to /bin/sh 34589 1727204105.25127: Set connection var ansible_timeout to 10 34589 1727204105.25136: Set connection var ansible_shell_type to sh 34589 1727204105.25199: Set connection var ansible_connection to ssh 34589 1727204105.25202: Set connection var ansible_pipelining to False 34589 1727204105.25204: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.25207: variable 'ansible_connection' from source: unknown 34589 1727204105.25209: variable 'ansible_module_compression' from source: unknown 34589 1727204105.25211: variable 'ansible_shell_type' from source: unknown 34589 1727204105.25213: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.25221: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.25229: variable 'ansible_pipelining' from source: unknown 34589 1727204105.25236: variable 'ansible_timeout' from source: unknown 34589 1727204105.25245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.25406: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34589 1727204105.25492: variable 'omit' from source: magic vars 34589 1727204105.25495: starting attempt loop 34589 1727204105.25499: running the handler 34589 1727204105.25502: handler run complete 34589 1727204105.25504: attempt loop complete, returning result 34589 1727204105.25506: _execute() done 34589 1727204105.25508: dumping result to json 34589 1727204105.25510: done dumping result, returning 34589 1727204105.25513: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-a9c6-cddc-000000000193] 34589 1727204105.25599: sending task result for task 028d2410-947f-a9c6-cddc-000000000193 34589 1727204105.25924: done sending task result for task 028d2410-947f-a9c6-cddc-000000000193 34589 1727204105.25929: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34589 1727204105.26149: no more pending results, returning what we have 34589 1727204105.26152: results queue empty 34589 1727204105.26152: checking for any_errors_fatal 34589 1727204105.26162: done checking for any_errors_fatal 34589 1727204105.26162: checking for max_fail_percentage 34589 1727204105.26164: done checking for max_fail_percentage 34589 1727204105.26165: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.26165: done checking to see if all hosts have failed 34589 1727204105.26166: getting the remaining hosts for this loop 34589 1727204105.26167: done getting the remaining hosts for this loop 34589 1727204105.26172: getting the next task for host managed-node1 34589 1727204105.26182: done getting next task for host managed-node1 34589 1727204105.26185: ^ task is: TASK: Show current_interfaces 34589 1727204105.26187: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204105.26191: getting variables 34589 1727204105.26192: in VariableManager get_vars() 34589 1727204105.26231: Calling all_inventory to load vars for managed-node1 34589 1727204105.26233: Calling groups_inventory to load vars for managed-node1 34589 1727204105.26235: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.26246: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.26248: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.26251: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.26841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.27070: done with get_vars() 34589 1727204105.27083: done getting variables 34589 1727204105.27182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.049) 0:00:05.407 ***** 34589 1727204105.27214: entering _queue_task() for managed-node1/debug 34589 1727204105.27216: Creating lock for debug 34589 1727204105.27718: worker is 1 (out of 1 available) 34589 1727204105.27732: exiting _queue_task() for managed-node1/debug 34589 1727204105.27744: done queuing things up, now waiting for results queue to drain 34589 1727204105.27745: waiting for pending results... 
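The 'Show current_interfaces' task queued above is a plain debug of the fact set in the previous step. Judging from the MSG line it emits below, it is roughly:

- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
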
34589 1727204105.28121: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 34589 1727204105.28127: in run() - task 028d2410-947f-a9c6-cddc-000000000116 34589 1727204105.28147: variable 'ansible_search_path' from source: unknown 34589 1727204105.28155: variable 'ansible_search_path' from source: unknown 34589 1727204105.28201: calling self._execute() 34589 1727204105.28305: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.28325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.28338: variable 'omit' from source: magic vars 34589 1727204105.28891: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.28928: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.28941: variable 'omit' from source: magic vars 34589 1727204105.29092: variable 'omit' from source: magic vars 34589 1727204105.29146: variable 'current_interfaces' from source: set_fact 34589 1727204105.29183: variable 'omit' from source: magic vars 34589 1727204105.29242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204105.29288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204105.29322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204105.29344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.29361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.29398: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204105.29418: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.29427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.29630: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204105.29634: Set connection var ansible_shell_executable to /bin/sh 34589 1727204105.29636: Set connection var ansible_timeout to 10 34589 1727204105.29638: Set connection var ansible_shell_type to sh 34589 1727204105.29640: Set connection var ansible_connection to ssh 34589 1727204105.29642: Set connection var ansible_pipelining to False 34589 1727204105.29644: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.29647: variable 'ansible_connection' from source: unknown 34589 1727204105.29649: variable 'ansible_module_compression' from source: unknown 34589 1727204105.29651: variable 'ansible_shell_type' from source: unknown 34589 1727204105.29653: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.29655: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.29656: variable 'ansible_pipelining' from source: unknown 34589 1727204105.29658: variable 'ansible_timeout' from source: unknown 34589 1727204105.29660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.30184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
34589 1727204105.30187: variable 'omit' from source: magic vars 34589 1727204105.30190: starting attempt loop 34589 1727204105.30192: running the handler 34589 1727204105.30194: handler run complete 34589 1727204105.30196: attempt loop complete, returning result 34589 1727204105.30198: _execute() done 34589 1727204105.30200: dumping result to json 34589 1727204105.30202: done dumping result, returning 34589 1727204105.30204: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-a9c6-cddc-000000000116] 34589 1727204105.30206: sending task result for task 028d2410-947f-a9c6-cddc-000000000116 ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34589 1727204105.30324: no more pending results, returning what we have 34589 1727204105.30327: results queue empty 34589 1727204105.30328: checking for any_errors_fatal 34589 1727204105.30334: done checking for any_errors_fatal 34589 1727204105.30335: checking for max_fail_percentage 34589 1727204105.30336: done checking for max_fail_percentage 34589 1727204105.30337: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.30338: done checking to see if all hosts have failed 34589 1727204105.30338: getting the remaining hosts for this loop 34589 1727204105.30339: done getting the remaining hosts for this loop 34589 1727204105.30343: getting the next task for host managed-node1 34589 1727204105.30351: done getting next task for host managed-node1 34589 1727204105.30355: ^ task is: TASK: Include the task 'manage_test_interface.yml' 34589 1727204105.30357: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.30361: getting variables 34589 1727204105.30363: in VariableManager get_vars() 34589 1727204105.30402: Calling all_inventory to load vars for managed-node1 34589 1727204105.30405: Calling groups_inventory to load vars for managed-node1 34589 1727204105.30407: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.30419: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.30422: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.30426: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.30741: done sending task result for task 028d2410-947f-a9c6-cddc-000000000116 34589 1727204105.30744: WORKER PROCESS EXITING 34589 1727204105.30765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.31291: done with get_vars() 34589 1727204105.31306: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.042) 0:00:05.450 ***** 34589 1727204105.31508: entering _queue_task() for managed-node1/include_tasks 34589 1727204105.32186: worker is 1 (out of 1 available) 34589 1727204105.32200: exiting _queue_task() for managed-node1/include_tasks 34589 1727204105.32214: done queuing things up, now waiting for results queue to drain 34589 1727204105.32216: waiting for pending results... 
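The include task queued here sits at tests_ipv6_disabled.yml:16 and, as the next log entries show, pulls in tasks/manage_test_interface.yml. A plausible reconstruction based only on the task name and the included path reported below (the real playbook may also set vars such as state and type, which the guard tasks that follow reference):

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml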
34589 1727204105.32730: running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' 34589 1727204105.32840: in run() - task 028d2410-947f-a9c6-cddc-00000000000d 34589 1727204105.32887: variable 'ansible_search_path' from source: unknown 34589 1727204105.32930: calling self._execute() 34589 1727204105.33028: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.33040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.33080: variable 'omit' from source: magic vars 34589 1727204105.33438: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.33456: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.33467: _execute() done 34589 1727204105.33479: dumping result to json 34589 1727204105.33538: done dumping result, returning 34589 1727204105.33541: done running TaskExecutor() for managed-node1/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-a9c6-cddc-00000000000d] 34589 1727204105.33544: sending task result for task 028d2410-947f-a9c6-cddc-00000000000d 34589 1727204105.33620: done sending task result for task 028d2410-947f-a9c6-cddc-00000000000d 34589 1727204105.33622: WORKER PROCESS EXITING 34589 1727204105.33667: no more pending results, returning what we have 34589 1727204105.33672: in VariableManager get_vars() 34589 1727204105.33718: Calling all_inventory to load vars for managed-node1 34589 1727204105.33721: Calling groups_inventory to load vars for managed-node1 34589 1727204105.33723: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.33736: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.33739: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.33741: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.34239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.34526: done with get_vars() 34589 1727204105.34535: variable 'ansible_search_path' from source: unknown 34589 1727204105.34551: we have included files to process 34589 1727204105.34552: generating all_blocks data 34589 1727204105.34554: done generating all_blocks data 34589 1727204105.34559: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34589 1727204105.34560: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34589 1727204105.34563: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 34589 1727204105.35128: in VariableManager get_vars() 34589 1727204105.35151: done with get_vars() 34589 1727204105.35364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 34589 1727204105.36194: done processing included file 34589 1727204105.36197: iterating over new_blocks loaded from include file 34589 1727204105.36198: in VariableManager get_vars() 34589 1727204105.36214: done with get_vars() 34589 1727204105.36216: filtering new block on tags 34589 1727204105.36254: done filtering new block on tags 34589 1727204105.36257: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 34589 1727204105.36262: extending task lists for all hosts with included blocks 34589 1727204105.37404: done extending task lists 34589 1727204105.37406: done processing included files 34589 1727204105.37407: results queue empty 34589 1727204105.37408: checking for any_errors_fatal 34589 1727204105.37411: done checking for any_errors_fatal 34589 1727204105.37412: checking for max_fail_percentage 34589 1727204105.37413: done checking for max_fail_percentage 34589 1727204105.37414: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.37415: done checking to see if all hosts have failed 34589 1727204105.37415: getting the remaining hosts for this loop 34589 1727204105.37417: done getting the remaining hosts for this loop 34589 1727204105.37420: getting the next task for host managed-node1 34589 1727204105.37424: done getting next task for host managed-node1 34589 1727204105.37426: ^ task is: TASK: Ensure state in ["present", "absent"] 34589 1727204105.37428: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.37431: getting variables 34589 1727204105.37432: in VariableManager get_vars() 34589 1727204105.37456: Calling all_inventory to load vars for managed-node1 34589 1727204105.37459: Calling groups_inventory to load vars for managed-node1 34589 1727204105.37461: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.37468: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.37470: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.37473: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.37658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.37853: done with get_vars() 34589 1727204105.37865: done getting variables 34589 1727204105.37946: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.064) 0:00:05.514 ***** 34589 1727204105.37978: entering _queue_task() for managed-node1/fail 34589 1727204105.37980: Creating lock for fail 34589 1727204105.38502: worker is 1 (out of 1 available) 34589 1727204105.38513: exiting _queue_task() for managed-node1/fail 34589 1727204105.38523: done queuing things up, now waiting for results queue to drain 34589 1727204105.38524: waiting for pending results... 
34589 1727204105.38766: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 34589 1727204105.38771: in run() - task 028d2410-947f-a9c6-cddc-0000000001ae 34589 1727204105.38774: variable 'ansible_search_path' from source: unknown 34589 1727204105.38778: variable 'ansible_search_path' from source: unknown 34589 1727204105.38815: calling self._execute() 34589 1727204105.38916: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.38928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.38942: variable 'omit' from source: magic vars 34589 1727204105.39349: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.39366: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.39529: variable 'state' from source: include params 34589 1727204105.39622: Evaluated conditional (state not in ["present", "absent"]): False 34589 1727204105.39626: when evaluation is False, skipping this task 34589 1727204105.39628: _execute() done 34589 1727204105.39630: dumping result to json 34589 1727204105.39632: done dumping result, returning 34589 1727204105.39634: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [028d2410-947f-a9c6-cddc-0000000001ae] 34589 1727204105.39636: sending task result for task 028d2410-947f-a9c6-cddc-0000000001ae 34589 1727204105.39705: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001ae 34589 1727204105.39709: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 34589 1727204105.39770: no more pending results, returning what we have 34589 1727204105.39774: results queue empty 34589 1727204105.39776: checking for any_errors_fatal 34589 1727204105.39778: done checking for any_errors_fatal 34589 1727204105.39779: checking for max_fail_percentage 34589 1727204105.39781: done checking for max_fail_percentage 34589 1727204105.39782: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.39782: done checking to see if all hosts have failed 34589 1727204105.39783: getting the remaining hosts for this loop 34589 1727204105.39784: done getting the remaining hosts for this loop 34589 1727204105.39789: getting the next task for host managed-node1 34589 1727204105.39795: done getting next task for host managed-node1 34589 1727204105.39798: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 34589 1727204105.39802: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204105.39806: getting variables 34589 1727204105.39808: in VariableManager get_vars() 34589 1727204105.39849: Calling all_inventory to load vars for managed-node1 34589 1727204105.39852: Calling groups_inventory to load vars for managed-node1 34589 1727204105.39855: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.39868: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.39872: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.39882: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.40323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.40627: done with get_vars() 34589 1727204105.40643: done getting variables 34589 1727204105.40704: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.027) 0:00:05.542 ***** 34589 1727204105.40737: entering _queue_task() for managed-node1/fail 34589 1727204105.41063: worker is 1 (out of 1 available) 34589 1727204105.41083: exiting _queue_task() for managed-node1/fail 34589 1727204105.41097: done queuing things up, now waiting for results queue to drain 34589 1727204105.41100: waiting for pending results... 34589 1727204105.41371: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 34589 1727204105.41505: in run() - task 028d2410-947f-a9c6-cddc-0000000001af 34589 1727204105.41509: variable 'ansible_search_path' from source: unknown 34589 1727204105.41512: variable 'ansible_search_path' from source: unknown 34589 1727204105.41515: calling self._execute() 34589 1727204105.41582: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.41586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.41589: variable 'omit' from source: magic vars 34589 1727204105.41958: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.41962: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.42155: variable 'type' from source: set_fact 34589 1727204105.42159: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 34589 1727204105.42162: when evaluation is False, skipping this task 34589 1727204105.42164: _execute() done 34589 1727204105.42171: dumping result to json 34589 1727204105.42174: done dumping result, returning 34589 1727204105.42178: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-a9c6-cddc-0000000001af] 34589 1727204105.42181: sending task result for task 028d2410-947f-a9c6-cddc-0000000001af 34589 1727204105.42238: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001af 34589 1727204105.42241: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 34589 1727204105.42306: no more pending 
results, returning what we have 34589 1727204105.42310: results queue empty 34589 1727204105.42310: checking for any_errors_fatal 34589 1727204105.42317: done checking for any_errors_fatal 34589 1727204105.42318: checking for max_fail_percentage 34589 1727204105.42319: done checking for max_fail_percentage 34589 1727204105.42320: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.42321: done checking to see if all hosts have failed 34589 1727204105.42321: getting the remaining hosts for this loop 34589 1727204105.42322: done getting the remaining hosts for this loop 34589 1727204105.42326: getting the next task for host managed-node1 34589 1727204105.42333: done getting next task for host managed-node1 34589 1727204105.42335: ^ task is: TASK: Include the task 'show_interfaces.yml' 34589 1727204105.42338: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.42342: getting variables 34589 1727204105.42344: in VariableManager get_vars() 34589 1727204105.42387: Calling all_inventory to load vars for managed-node1 34589 1727204105.42390: Calling groups_inventory to load vars for managed-node1 34589 1727204105.42392: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.42401: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.42404: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.42406: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.42570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.42783: done with get_vars() 34589 1727204105.42805: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.021) 0:00:05.564 ***** 34589 1727204105.42915: entering _queue_task() for managed-node1/include_tasks 34589 1727204105.43500: worker is 1 (out of 1 available) 34589 1727204105.43509: exiting _queue_task() for managed-node1/include_tasks 34589 1727204105.43521: done queuing things up, now waiting for results queue to drain 34589 1727204105.43522: waiting for pending results... 
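Both guard tasks above were skipped because their conditions evaluated to False. Judging from the false_condition strings in the two skip results, manage_test_interface.yml:3 and :8 likely implement the usual fail-with-when validation pattern, roughly as follows (the when expressions come from the log; the messages are placeholders, since the real ones are not shown):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be present or absent    # exact message not shown in this log
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be dummy, tap or veth    # exact message not shown in this log
  when: type not in ["dummy", "tap", "veth"]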
34589 1727204105.43656: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 34589 1727204105.43752: in run() - task 028d2410-947f-a9c6-cddc-0000000001b0 34589 1727204105.43755: variable 'ansible_search_path' from source: unknown 34589 1727204105.43758: variable 'ansible_search_path' from source: unknown 34589 1727204105.43761: calling self._execute() 34589 1727204105.43864: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.43878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.43896: variable 'omit' from source: magic vars 34589 1727204105.44299: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.44323: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.44334: _execute() done 34589 1727204105.44381: dumping result to json 34589 1727204105.44385: done dumping result, returning 34589 1727204105.44388: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-a9c6-cddc-0000000001b0] 34589 1727204105.44390: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b0 34589 1727204105.44646: no more pending results, returning what we have 34589 1727204105.44652: in VariableManager get_vars() 34589 1727204105.44705: Calling all_inventory to load vars for managed-node1 34589 1727204105.44709: Calling groups_inventory to load vars for managed-node1 34589 1727204105.44712: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.44734: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.44738: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.44781: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b0 34589 1727204105.44785: WORKER PROCESS EXITING 34589 1727204105.44790: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.45199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.45404: done with get_vars() 34589 1727204105.45412: variable 'ansible_search_path' from source: unknown 34589 1727204105.45414: variable 'ansible_search_path' from source: unknown 34589 1727204105.45451: we have included files to process 34589 1727204105.45452: generating all_blocks data 34589 1727204105.45454: done generating all_blocks data 34589 1727204105.45458: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204105.45459: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204105.45461: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 34589 1727204105.45564: in VariableManager get_vars() 34589 1727204105.45592: done with get_vars() 34589 1727204105.45685: done processing included file 34589 1727204105.45686: iterating over new_blocks loaded from include file 34589 1727204105.45687: in VariableManager get_vars() 34589 1727204105.45698: done with get_vars() 34589 1727204105.45699: filtering new block on tags 34589 1727204105.45717: done filtering new block on tags 34589 1727204105.45719: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 34589 1727204105.45723: extending task lists for all hosts with included blocks 34589 1727204105.45968: done extending task lists 34589 1727204105.45969: done processing included files 34589 1727204105.45970: results queue empty 34589 1727204105.45970: checking for any_errors_fatal 34589 1727204105.45973: done checking for any_errors_fatal 34589 1727204105.45973: checking for max_fail_percentage 34589 1727204105.45974: done checking for max_fail_percentage 34589 1727204105.45974: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.45975: done checking to see if all hosts have failed 34589 1727204105.45977: getting the remaining hosts for this loop 34589 1727204105.45978: done getting the remaining hosts for this loop 34589 1727204105.45980: getting the next task for host managed-node1 34589 1727204105.45983: done getting next task for host managed-node1 34589 1727204105.45984: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 34589 1727204105.45986: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.45988: getting variables 34589 1727204105.45988: in VariableManager get_vars() 34589 1727204105.45997: Calling all_inventory to load vars for managed-node1 34589 1727204105.45998: Calling groups_inventory to load vars for managed-node1 34589 1727204105.45999: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.46004: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.46005: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.46009: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.46101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.46237: done with get_vars() 34589 1727204105.46244: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.033) 0:00:05.597 ***** 34589 1727204105.46302: entering _queue_task() for managed-node1/include_tasks 34589 1727204105.46554: worker is 1 (out of 1 available) 34589 1727204105.46568: exiting _queue_task() for managed-node1/include_tasks 34589 1727204105.46580: done queuing things up, now waiting for results queue to drain 34589 1727204105.46582: waiting for pending results... 
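From the task paths recorded in this log, show_interfaces.yml first includes get_current_interfaces.yml (line 3) and then prints the resulting fact with the debug task sketched earlier (line 5). A hedged sketch of that first step, assuming a plain include with no extra vars:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml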
34589 1727204105.46743: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 34589 1727204105.46814: in run() - task 028d2410-947f-a9c6-cddc-000000000245 34589 1727204105.46826: variable 'ansible_search_path' from source: unknown 34589 1727204105.46830: variable 'ansible_search_path' from source: unknown 34589 1727204105.46857: calling self._execute() 34589 1727204105.46933: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.46936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.46939: variable 'omit' from source: magic vars 34589 1727204105.47247: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.47252: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.47255: _execute() done 34589 1727204105.47257: dumping result to json 34589 1727204105.47260: done dumping result, returning 34589 1727204105.47263: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-a9c6-cddc-000000000245] 34589 1727204105.47265: sending task result for task 028d2410-947f-a9c6-cddc-000000000245 34589 1727204105.47331: done sending task result for task 028d2410-947f-a9c6-cddc-000000000245 34589 1727204105.47334: WORKER PROCESS EXITING 34589 1727204105.47375: no more pending results, returning what we have 34589 1727204105.47381: in VariableManager get_vars() 34589 1727204105.47425: Calling all_inventory to load vars for managed-node1 34589 1727204105.47428: Calling groups_inventory to load vars for managed-node1 34589 1727204105.47430: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.47445: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.47447: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.47450: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.47599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.47719: done with get_vars() 34589 1727204105.47725: variable 'ansible_search_path' from source: unknown 34589 1727204105.47725: variable 'ansible_search_path' from source: unknown 34589 1727204105.47764: we have included files to process 34589 1727204105.47765: generating all_blocks data 34589 1727204105.47766: done generating all_blocks data 34589 1727204105.47767: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204105.47768: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204105.47771: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 34589 1727204105.47951: done processing included file 34589 1727204105.47952: iterating over new_blocks loaded from include file 34589 1727204105.47954: in VariableManager get_vars() 34589 1727204105.47974: done with get_vars() 34589 1727204105.47978: filtering new block on tags 34589 1727204105.47995: done filtering new block on tags 34589 1727204105.47998: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 34589 1727204105.48003: extending task lists for all hosts with included blocks 34589 1727204105.48104: done extending task lists 34589 1727204105.48105: done processing included files 34589 1727204105.48106: results queue empty 34589 1727204105.48107: checking for any_errors_fatal 34589 1727204105.48113: done checking for any_errors_fatal 34589 1727204105.48114: checking for max_fail_percentage 34589 1727204105.48115: done checking for max_fail_percentage 34589 1727204105.48116: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.48117: done checking to see if all hosts have failed 34589 1727204105.48117: getting the remaining hosts for this loop 34589 1727204105.48118: done getting the remaining hosts for this loop 34589 1727204105.48121: getting the next task for host managed-node1 34589 1727204105.48126: done getting next task for host managed-node1 34589 1727204105.48128: ^ task is: TASK: Gather current interface info 34589 1727204105.48131: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204105.48133: getting variables 34589 1727204105.48134: in VariableManager get_vars() 34589 1727204105.48149: Calling all_inventory to load vars for managed-node1 34589 1727204105.48152: Calling groups_inventory to load vars for managed-node1 34589 1727204105.48153: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.48158: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.48160: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.48162: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.48324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.48503: done with get_vars() 34589 1727204105.48513: done getting variables 34589 1727204105.48556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.022) 0:00:05.620 ***** 34589 1727204105.48589: entering _queue_task() for managed-node1/command 34589 1727204105.48901: worker is 1 (out of 1 available) 34589 1727204105.48913: exiting _queue_task() for managed-node1/command 34589 1727204105.48925: done queuing things up, now waiting for results queue to drain 34589 1727204105.48926: waiting for pending results... 
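The "Gather current interface info" task queued here (get_current_interfaces.yml:3) is a command task. The module arguments visible further down in this log (chdir=/sys/class/net, ls -1) suggest it is roughly equivalent to the sketch below; the register variable name is a guess, since the log does not show it:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces    # hypothetical name, not shown in the log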
34589 1727204105.49218: running TaskExecutor() for managed-node1/TASK: Gather current interface info 34589 1727204105.49303: in run() - task 028d2410-947f-a9c6-cddc-00000000027c 34589 1727204105.49312: variable 'ansible_search_path' from source: unknown 34589 1727204105.49314: variable 'ansible_search_path' from source: unknown 34589 1727204105.49352: calling self._execute() 34589 1727204105.49413: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.49418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.49428: variable 'omit' from source: magic vars 34589 1727204105.49736: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.49746: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.49752: variable 'omit' from source: magic vars 34589 1727204105.49791: variable 'omit' from source: magic vars 34589 1727204105.49818: variable 'omit' from source: magic vars 34589 1727204105.49852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204105.49882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204105.49902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204105.49916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.49925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.49949: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204105.49952: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.49955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.50028: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204105.50031: Set connection var ansible_shell_executable to /bin/sh 34589 1727204105.50041: Set connection var ansible_timeout to 10 34589 1727204105.50044: Set connection var ansible_shell_type to sh 34589 1727204105.50048: Set connection var ansible_connection to ssh 34589 1727204105.50053: Set connection var ansible_pipelining to False 34589 1727204105.50071: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.50074: variable 'ansible_connection' from source: unknown 34589 1727204105.50078: variable 'ansible_module_compression' from source: unknown 34589 1727204105.50081: variable 'ansible_shell_type' from source: unknown 34589 1727204105.50083: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.50085: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.50087: variable 'ansible_pipelining' from source: unknown 34589 1727204105.50090: variable 'ansible_timeout' from source: unknown 34589 1727204105.50094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.50200: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204105.50212: variable 'omit' from source: magic vars 34589 
1727204105.50215: starting attempt loop 34589 1727204105.50219: running the handler 34589 1727204105.50233: _low_level_execute_command(): starting 34589 1727204105.50239: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204105.50781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.50785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204105.50789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204105.50850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204105.50853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.50859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.50947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.52799: stdout chunk (state=3): >>>/root <<< 34589 1727204105.52934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.52938: stdout chunk (state=3): >>><<< 34589 1727204105.52941: stderr chunk (state=3): >>><<< 34589 1727204105.53073: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204105.53080: _low_level_execute_command(): starting 34589 1727204105.53084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672 
`" && echo ansible-tmp-1727204105.5297165-35400-243437347605672="` echo /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672 `" ) && sleep 0' 34589 1727204105.53717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.53755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.53785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.54199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.54226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.54343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.56572: stdout chunk (state=3): >>>ansible-tmp-1727204105.5297165-35400-243437347605672=/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672 <<< 34589 1727204105.56630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.56693: stderr chunk (state=3): >>><<< 34589 1727204105.56704: stdout chunk (state=3): >>><<< 34589 1727204105.56734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204105.5297165-35400-243437347605672=/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204105.56771: variable 'ansible_module_compression' from source: unknown 34589 1727204105.56840: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204105.56889: variable 'ansible_facts' from source: unknown 34589 1727204105.56991: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py 34589 1727204105.57148: Sending initial data 34589 1727204105.57153: Sent initial data (156 bytes) 34589 1727204105.57907: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.57923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.57945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.58052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.58082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.58200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.60015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204105.60095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204105.60187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp7nv5z0jh /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py <<< 34589 1727204105.60198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py" <<< 34589 1727204105.60263: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp7nv5z0jh" to remote "/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py" <<< 34589 1727204105.61135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.61293: stderr chunk (state=3): >>><<< 34589 1727204105.61296: stdout chunk (state=3): >>><<< 34589 1727204105.61298: done transferring module to remote 34589 1727204105.61301: _low_level_execute_command(): starting 34589 1727204105.61303: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/ /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py && sleep 0' 34589 1727204105.61861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.61881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.61991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.62015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.62133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.64310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.64315: stdout chunk (state=3): >>><<< 34589 1727204105.64319: stderr chunk (state=3): >>><<< 34589 1727204105.64322: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204105.64331: _low_level_execute_command(): starting 34589 1727204105.64334: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/AnsiballZ_command.py && sleep 0' 34589 1727204105.64878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.64895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.64913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.64997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204105.65034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204105.65052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.65065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.65194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.82242: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:05.816873", "end": "2024-09-24 14:55:05.820548", "delta": "0:00:00.003675", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204105.84121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.84182: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204105.84186: stdout chunk (state=3): >>><<< 34589 1727204105.84194: stderr chunk (state=3): >>><<< 34589 1727204105.84224: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:05.816873", "end": "2024-09-24 14:55:05.820548", "delta": "0:00:00.003675", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204105.84260: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204105.84380: _low_level_execute_command(): starting 34589 1727204105.84384: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204105.5297165-35400-243437347605672/ > /dev/null 2>&1 && sleep 0' 34589 1727204105.84941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204105.84952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204105.84965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204105.84990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204105.85005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204105.85011: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204105.85023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204105.85048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204105.85137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204105.85168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204105.85289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204105.87484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204105.87488: stdout chunk (state=3): >>><<< 34589 1727204105.87491: stderr chunk (state=3): >>><<< 34589 1727204105.87494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204105.87502: handler run complete 34589 1727204105.87504: Evaluated conditional (False): False 34589 1727204105.87509: attempt loop complete, returning result 34589 1727204105.87511: _execute() done 34589 1727204105.87518: dumping result to json 34589 1727204105.87520: done dumping result, returning 34589 1727204105.87522: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-a9c6-cddc-00000000027c] 34589 1727204105.87524: sending task result for task 028d2410-947f-a9c6-cddc-00000000027c 34589 1727204105.87594: done sending task result for task 028d2410-947f-a9c6-cddc-00000000027c 34589 1727204105.87597: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003675", "end": "2024-09-24 14:55:05.820548", "rc": 0, "start": "2024-09-24 14:55:05.816873" } STDOUT: bonding_masters eth0 lo 34589 1727204105.87713: no more pending results, returning what we have 34589 1727204105.87717: results queue empty 34589 1727204105.87718: checking for any_errors_fatal 34589 1727204105.87719: done checking for any_errors_fatal 34589 1727204105.87720: checking for max_fail_percentage 34589 1727204105.87721: done checking for max_fail_percentage 34589 1727204105.87722: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.87723: done checking to see if all hosts have failed 34589 1727204105.87724: getting the remaining hosts for this loop 34589 1727204105.87725: done getting the remaining hosts for this loop 34589 1727204105.87731: getting the next task for host managed-node1 34589 1727204105.87857: done getting next task for host managed-node1 34589 1727204105.87861: ^ task is: TASK: Set current_interfaces 34589 1727204105.87866: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.87870: getting variables 34589 1727204105.87872: in VariableManager get_vars() 34589 1727204105.87913: Calling all_inventory to load vars for managed-node1 34589 1727204105.87917: Calling groups_inventory to load vars for managed-node1 34589 1727204105.87919: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.87932: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.87935: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.87938: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.88537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.88785: done with get_vars() 34589 1727204105.88797: done getting variables 34589 1727204105.88865: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.403) 0:00:06.023 ***** 34589 1727204105.88898: entering _queue_task() for managed-node1/set_fact 34589 1727204105.89215: worker is 1 (out of 1 available) 34589 1727204105.89228: exiting _queue_task() for managed-node1/set_fact 34589 1727204105.89239: done queuing things up, now waiting for results queue to drain 34589 1727204105.89240: waiting for pending results... 
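The "Set current_interfaces" task queued above (get_current_interfaces.yml:9) is a set_fact action; its result further down shows current_interfaces being set to the three names returned by the ls run. A plausible sketch is given below, with the exact Jinja2 expression being an assumption, since the playbook source itself is not visible in this excerpt.

    # Sketch only; the exact expression is assumed, not taken from the playbook.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"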
34589 1727204105.89513: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 34589 1727204105.89682: in run() - task 028d2410-947f-a9c6-cddc-00000000027d 34589 1727204105.89686: variable 'ansible_search_path' from source: unknown 34589 1727204105.89689: variable 'ansible_search_path' from source: unknown 34589 1727204105.89692: calling self._execute() 34589 1727204105.89752: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.89756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.89767: variable 'omit' from source: magic vars 34589 1727204105.90152: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.90164: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.90171: variable 'omit' from source: magic vars 34589 1727204105.90223: variable 'omit' from source: magic vars 34589 1727204105.90336: variable '_current_interfaces' from source: set_fact 34589 1727204105.90408: variable 'omit' from source: magic vars 34589 1727204105.90480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204105.90488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204105.90509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204105.90524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.90536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.90680: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204105.90684: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.90687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.90690: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204105.90692: Set connection var ansible_shell_executable to /bin/sh 34589 1727204105.90701: Set connection var ansible_timeout to 10 34589 1727204105.90704: Set connection var ansible_shell_type to sh 34589 1727204105.90712: Set connection var ansible_connection to ssh 34589 1727204105.90717: Set connection var ansible_pipelining to False 34589 1727204105.90740: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.90743: variable 'ansible_connection' from source: unknown 34589 1727204105.90746: variable 'ansible_module_compression' from source: unknown 34589 1727204105.90748: variable 'ansible_shell_type' from source: unknown 34589 1727204105.90750: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.90752: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.90755: variable 'ansible_pipelining' from source: unknown 34589 1727204105.90759: variable 'ansible_timeout' from source: unknown 34589 1727204105.90763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.90911: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34589 1727204105.90923: variable 'omit' from source: magic vars 34589 1727204105.90929: starting attempt loop 34589 1727204105.90933: running the handler 34589 1727204105.90944: handler run complete 34589 1727204105.90953: attempt loop complete, returning result 34589 1727204105.90956: _execute() done 34589 1727204105.90958: dumping result to json 34589 1727204105.90961: done dumping result, returning 34589 1727204105.90969: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-a9c6-cddc-00000000027d] 34589 1727204105.90972: sending task result for task 028d2410-947f-a9c6-cddc-00000000027d 34589 1727204105.91058: done sending task result for task 028d2410-947f-a9c6-cddc-00000000027d 34589 1727204105.91063: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 34589 1727204105.91132: no more pending results, returning what we have 34589 1727204105.91135: results queue empty 34589 1727204105.91136: checking for any_errors_fatal 34589 1727204105.91146: done checking for any_errors_fatal 34589 1727204105.91146: checking for max_fail_percentage 34589 1727204105.91148: done checking for max_fail_percentage 34589 1727204105.91149: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.91150: done checking to see if all hosts have failed 34589 1727204105.91150: getting the remaining hosts for this loop 34589 1727204105.91152: done getting the remaining hosts for this loop 34589 1727204105.91156: getting the next task for host managed-node1 34589 1727204105.91165: done getting next task for host managed-node1 34589 1727204105.91167: ^ task is: TASK: Show current_interfaces 34589 1727204105.91173: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204105.91180: getting variables 34589 1727204105.91182: in VariableManager get_vars() 34589 1727204105.91225: Calling all_inventory to load vars for managed-node1 34589 1727204105.91229: Calling groups_inventory to load vars for managed-node1 34589 1727204105.91232: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.91244: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.91247: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.91251: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.91690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.91895: done with get_vars() 34589 1727204105.91907: done getting variables 34589 1727204105.91972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.031) 0:00:06.054 ***** 34589 1727204105.92006: entering _queue_task() for managed-node1/debug 34589 1727204105.92414: worker is 1 (out of 1 available) 34589 1727204105.92425: exiting _queue_task() for managed-node1/debug 34589 1727204105.92436: done queuing things up, now waiting for results queue to drain 34589 1727204105.92437: waiting for pending results... 34589 1727204105.92693: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 34589 1727204105.92782: in run() - task 028d2410-947f-a9c6-cddc-000000000246 34589 1727204105.92787: variable 'ansible_search_path' from source: unknown 34589 1727204105.92790: variable 'ansible_search_path' from source: unknown 34589 1727204105.92792: calling self._execute() 34589 1727204105.92858: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.92863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.92873: variable 'omit' from source: magic vars 34589 1727204105.93310: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.93319: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.93331: variable 'omit' from source: magic vars 34589 1727204105.93384: variable 'omit' from source: magic vars 34589 1727204105.93495: variable 'current_interfaces' from source: set_fact 34589 1727204105.93580: variable 'omit' from source: magic vars 34589 1727204105.93584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204105.93613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204105.93632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204105.93650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.93668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204105.93707: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204105.93711: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.93714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.93822: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204105.93827: Set connection var ansible_shell_executable to /bin/sh 34589 1727204105.93981: Set connection var ansible_timeout to 10 34589 1727204105.93983: Set connection var ansible_shell_type to sh 34589 1727204105.93986: Set connection var ansible_connection to ssh 34589 1727204105.93988: Set connection var ansible_pipelining to False 34589 1727204105.93990: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.93993: variable 'ansible_connection' from source: unknown 34589 1727204105.93995: variable 'ansible_module_compression' from source: unknown 34589 1727204105.93997: variable 'ansible_shell_type' from source: unknown 34589 1727204105.93999: variable 'ansible_shell_executable' from source: unknown 34589 1727204105.94001: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.94003: variable 'ansible_pipelining' from source: unknown 34589 1727204105.94010: variable 'ansible_timeout' from source: unknown 34589 1727204105.94013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.94053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204105.94063: variable 'omit' from source: magic vars 34589 1727204105.94068: starting attempt loop 34589 1727204105.94073: running the handler 34589 1727204105.94128: handler run complete 34589 1727204105.94141: attempt loop complete, returning result 34589 1727204105.94144: _execute() done 34589 1727204105.94147: dumping result to json 34589 1727204105.94149: done dumping result, returning 34589 1727204105.94157: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-a9c6-cddc-000000000246] 34589 1727204105.94160: sending task result for task 028d2410-947f-a9c6-cddc-000000000246 34589 1727204105.94249: done sending task result for task 028d2410-947f-a9c6-cddc-000000000246 34589 1727204105.94253: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 34589 1727204105.94306: no more pending results, returning what we have 34589 1727204105.94309: results queue empty 34589 1727204105.94310: checking for any_errors_fatal 34589 1727204105.94316: done checking for any_errors_fatal 34589 1727204105.94317: checking for max_fail_percentage 34589 1727204105.94319: done checking for max_fail_percentage 34589 1727204105.94320: checking to see if all hosts have failed and the running result is not ok 34589 1727204105.94320: done checking to see if all hosts have failed 34589 1727204105.94321: getting the remaining hosts for this loop 34589 1727204105.94322: done getting the remaining hosts for this loop 34589 1727204105.94326: getting the next task for host managed-node1 34589 1727204105.94446: done getting next task for host managed-node1 34589 1727204105.94450: ^ task is: TASK: Install iproute 34589 1727204105.94453: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204105.94458: getting variables 34589 1727204105.94460: in VariableManager get_vars() 34589 1727204105.94499: Calling all_inventory to load vars for managed-node1 34589 1727204105.94502: Calling groups_inventory to load vars for managed-node1 34589 1727204105.94504: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204105.94514: Calling all_plugins_play to load vars for managed-node1 34589 1727204105.94517: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204105.94520: Calling groups_plugins_play to load vars for managed-node1 34589 1727204105.94887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204105.95099: done with get_vars() 34589 1727204105.95110: done getting variables 34589 1727204105.95177: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.032) 0:00:06.087 ***** 34589 1727204105.95255: entering _queue_task() for managed-node1/package 34589 1727204105.95771: worker is 1 (out of 1 available) 34589 1727204105.95895: exiting _queue_task() for managed-node1/package 34589 1727204105.95906: done queuing things up, now waiting for results queue to drain 34589 1727204105.95908: waiting for pending results... 
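The "Install iproute" task queued above (manage_test_interface.yml:16) goes through the generic package action, and the dnf module_args captured further down show name=iproute and state=present; the later "attempts": 1 result field and the "(__install_status is success)" conditional suggest a retry loop. A hedged sketch follows; the retries and delay values are assumptions, only the until condition and package arguments appear in the log.

    # Sketch; retries/delay are assumed, the until condition and package args come from the log.
    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3   # assumed
      delay: 5     # assumed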
34589 1727204105.96300: running TaskExecutor() for managed-node1/TASK: Install iproute 34589 1727204105.96308: in run() - task 028d2410-947f-a9c6-cddc-0000000001b1 34589 1727204105.96312: variable 'ansible_search_path' from source: unknown 34589 1727204105.96315: variable 'ansible_search_path' from source: unknown 34589 1727204105.96318: calling self._execute() 34589 1727204105.96344: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204105.96350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204105.96366: variable 'omit' from source: magic vars 34589 1727204105.96881: variable 'ansible_distribution_major_version' from source: facts 34589 1727204105.96885: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204105.96888: variable 'omit' from source: magic vars 34589 1727204105.96890: variable 'omit' from source: magic vars 34589 1727204105.97051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204105.99260: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204105.99344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204105.99396: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204105.99437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204105.99469: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204105.99580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204105.99621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204105.99695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204105.99698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204105.99721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204105.99836: variable '__network_is_ostree' from source: set_fact 34589 1727204105.99847: variable 'omit' from source: magic vars 34589 1727204105.99890: variable 'omit' from source: magic vars 34589 1727204105.99991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204106.00131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204106.00134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204106.00136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34589 1727204106.00137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204106.00139: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204106.00141: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204106.00142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204106.00215: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204106.00227: Set connection var ansible_shell_executable to /bin/sh 34589 1727204106.00246: Set connection var ansible_timeout to 10 34589 1727204106.00254: Set connection var ansible_shell_type to sh 34589 1727204106.00265: Set connection var ansible_connection to ssh 34589 1727204106.00274: Set connection var ansible_pipelining to False 34589 1727204106.00304: variable 'ansible_shell_executable' from source: unknown 34589 1727204106.00315: variable 'ansible_connection' from source: unknown 34589 1727204106.00322: variable 'ansible_module_compression' from source: unknown 34589 1727204106.00328: variable 'ansible_shell_type' from source: unknown 34589 1727204106.00335: variable 'ansible_shell_executable' from source: unknown 34589 1727204106.00343: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204106.00456: variable 'ansible_pipelining' from source: unknown 34589 1727204106.00459: variable 'ansible_timeout' from source: unknown 34589 1727204106.00462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204106.00475: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204106.00494: variable 'omit' from source: magic vars 34589 1727204106.00504: starting attempt loop 34589 1727204106.00514: running the handler 34589 1727204106.00525: variable 'ansible_facts' from source: unknown 34589 1727204106.00533: variable 'ansible_facts' from source: unknown 34589 1727204106.00575: _low_level_execute_command(): starting 34589 1727204106.00590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204106.01468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204106.01487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.01592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.01634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.01856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.03586: stdout chunk (state=3): >>>/root <<< 34589 1727204106.03690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.03732: stderr chunk (state=3): >>><<< 34589 1727204106.03734: stdout chunk (state=3): >>><<< 34589 1727204106.03804: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.03815: _low_level_execute_command(): starting 34589 1727204106.03819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584 `" && echo ansible-tmp-1727204106.0375166-35430-274588019204584="` echo /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584 `" ) && sleep 0' 34589 1727204106.04481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204106.04485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.04542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.04554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.04567: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 34589 1727204106.04750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.06983: stdout chunk (state=3): >>>ansible-tmp-1727204106.0375166-35430-274588019204584=/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584 <<< 34589 1727204106.07002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.07005: stdout chunk (state=3): >>><<< 34589 1727204106.07015: stderr chunk (state=3): >>><<< 34589 1727204106.07039: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.0375166-35430-274588019204584=/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.07073: variable 'ansible_module_compression' from source: unknown 34589 1727204106.07145: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 34589 1727204106.07149: ANSIBALLZ: Acquiring lock 34589 1727204106.07151: ANSIBALLZ: Lock acquired: 140222054199088 34589 1727204106.07154: ANSIBALLZ: Creating module 34589 1727204106.21112: ANSIBALLZ: Writing module into payload 34589 1727204106.21251: ANSIBALLZ: Writing module 34589 1727204106.21270: ANSIBALLZ: Renaming module 34589 1727204106.21283: ANSIBALLZ: Done creating module 34589 1727204106.21300: variable 'ansible_facts' from source: unknown 34589 1727204106.21375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py 34589 1727204106.21483: Sending initial data 34589 1727204106.21486: Sent initial data (152 bytes) 34589 1727204106.21953: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204106.21957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.21960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204106.21962: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.21964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.22011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.22025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.22120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.23888: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204106.23959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204106.24044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp_p3gg1tp /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py <<< 34589 1727204106.24047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py" <<< 34589 1727204106.24119: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp_p3gg1tp" to remote "/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py" <<< 34589 1727204106.24122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py" <<< 34589 1727204106.24939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.24984: stderr chunk (state=3): >>><<< 34589 1727204106.24987: stdout chunk (state=3): >>><<< 34589 1727204106.25035: done transferring module to remote 34589 1727204106.25046: _low_level_execute_command(): starting 34589 1727204106.25051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/ /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py && sleep 0' 34589 1727204106.25529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.25534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found <<< 34589 1727204106.25536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.25538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.25540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.25542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.25597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.25600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.25680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.27679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.27707: stderr chunk (state=3): >>><<< 34589 1727204106.27710: stdout chunk (state=3): >>><<< 34589 1727204106.27731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.27734: _low_level_execute_command(): starting 34589 1727204106.27739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/AnsiballZ_dnf.py && sleep 0' 34589 1727204106.28174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.28211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.28215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204106.28217: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.28219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.28221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.28223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.28274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.28280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.28285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.28369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.75075: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 34589 1727204106.80783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204106.80787: stdout chunk (state=3): >>><<< 34589 1727204106.80790: stderr chunk (state=3): >>><<< 34589 1727204106.80796: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
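The module result above shows that the package action was serviced by ansible.legacy.dnf and returned "Nothing to do" with rc=0, i.e. iproute was already installed on the target. The recorded module_args map onto an explicit dnf task roughly like the sketch below; this is an illustrative equivalent form, not the task as written in the playbook.

    # Illustrative equivalent of the dnf invocation recorded above.
    - name: Install iproute (explicit dnf form)
      ansible.builtin.dnf:
        name: iproute
        state: present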
34589 1727204106.80843: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204106.80848: _low_level_execute_command(): starting 34589 1727204106.80854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.0375166-35430-274588019204584/ > /dev/null 2>&1 && sleep 0' 34589 1727204106.81590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204106.81603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.81614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204106.81655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204106.81663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.81741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.81761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.81844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.83925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.83929: stdout chunk (state=3): >>><<< 34589 1727204106.83932: stderr chunk (state=3): >>><<< 34589 1727204106.84083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.84087: handler run complete 34589 1727204106.84407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204106.84786: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204106.85036: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204106.85040: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204106.85042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204106.85195: variable '__install_status' from source: unknown 34589 1727204106.85221: Evaluated conditional (__install_status is success): True 34589 1727204106.85241: attempt loop complete, returning result 34589 1727204106.85302: _execute() done 34589 1727204106.85311: dumping result to json 34589 1727204106.85321: done dumping result, returning 34589 1727204106.85332: done running TaskExecutor() for managed-node1/TASK: Install iproute [028d2410-947f-a9c6-cddc-0000000001b1] 34589 1727204106.85339: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b1 ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 34589 1727204106.85704: no more pending results, returning what we have 34589 1727204106.85707: results queue empty 34589 1727204106.85708: checking for any_errors_fatal 34589 1727204106.85713: done checking for any_errors_fatal 34589 1727204106.85714: checking for max_fail_percentage 34589 1727204106.85716: done checking for max_fail_percentage 34589 1727204106.85717: checking to see if all hosts have failed and the running result is not ok 34589 1727204106.85718: done checking to see if all hosts have failed 34589 1727204106.85719: getting the remaining hosts for this loop 34589 1727204106.85720: done getting the remaining hosts for this loop 34589 1727204106.85724: getting the next task for host managed-node1 34589 1727204106.85730: done getting next task for host managed-node1 34589 1727204106.85733: ^ task is: TASK: Create veth interface {{ interface }} 34589 1727204106.85736: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204106.85740: getting variables 34589 1727204106.85742: in VariableManager get_vars() 34589 1727204106.86085: Calling all_inventory to load vars for managed-node1 34589 1727204106.86088: Calling groups_inventory to load vars for managed-node1 34589 1727204106.86092: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204106.86103: Calling all_plugins_play to load vars for managed-node1 34589 1727204106.86106: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204106.86109: Calling groups_plugins_play to load vars for managed-node1 34589 1727204106.86714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204106.87227: done with get_vars() 34589 1727204106.87242: done getting variables 34589 1727204106.87293: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b1 34589 1727204106.87297: WORKER PROCESS EXITING 34589 1727204106.87419: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204106.87585: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.924) 0:00:07.012 ***** 34589 1727204106.87733: entering _queue_task() for managed-node1/command 34589 1727204106.88367: worker is 1 (out of 1 available) 34589 1727204106.88491: exiting _queue_task() for managed-node1/command 34589 1727204106.88505: done queuing things up, now waiting for results queue to drain 34589 1727204106.88506: waiting for pending results... 
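The "Create veth interface ethtest0" task queued above (manage_test_interface.yml:27) uses the command action with an items lookup, guarded by the conditional evaluated just below (type == 'veth' and state == 'present' and interface not in current_interfaces). The actual ip commands are not visible in this excerpt; the loop items in the sketch below are purely assumed examples of the typical shape, while the task name, loop style, and when condition come from the log.

    # Sketch; the loop items are assumptions, only the task name, loop style,
    # and when condition are taken from the log.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}  # assumed
        - ip link set peer{{ interface }} up                                   # assumed
        - ip link set {{ interface }} up                                       # assumed
      when: type == 'veth' and state == 'present' and interface not in current_interfaces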
34589 1727204106.88843: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 34589 1727204106.88925: in run() - task 028d2410-947f-a9c6-cddc-0000000001b2 34589 1727204106.88938: variable 'ansible_search_path' from source: unknown 34589 1727204106.88942: variable 'ansible_search_path' from source: unknown 34589 1727204106.89608: variable 'interface' from source: set_fact 34589 1727204106.89695: variable 'interface' from source: set_fact 34589 1727204106.89767: variable 'interface' from source: set_fact 34589 1727204106.89981: Loaded config def from plugin (lookup/items) 34589 1727204106.89988: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34589 1727204106.90015: variable 'omit' from source: magic vars 34589 1727204106.90132: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204106.90141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204106.90150: variable 'omit' from source: magic vars 34589 1727204106.90652: variable 'ansible_distribution_major_version' from source: facts 34589 1727204106.90659: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204106.91058: variable 'type' from source: set_fact 34589 1727204106.91062: variable 'state' from source: include params 34589 1727204106.91066: variable 'interface' from source: set_fact 34589 1727204106.91071: variable 'current_interfaces' from source: set_fact 34589 1727204106.91080: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34589 1727204106.91241: variable 'omit' from source: magic vars 34589 1727204106.91279: variable 'omit' from source: magic vars 34589 1727204106.91327: variable 'item' from source: unknown 34589 1727204106.91396: variable 'item' from source: unknown 34589 1727204106.91413: variable 'omit' from source: magic vars 34589 1727204106.91447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204106.91476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204106.91658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204106.91677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204106.91692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204106.91751: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204106.91755: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204106.91757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204106.91856: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204106.91861: Set connection var ansible_shell_executable to /bin/sh 34589 1727204106.91870: Set connection var ansible_timeout to 10 34589 1727204106.91872: Set connection var ansible_shell_type to sh 34589 1727204106.91881: Set connection var ansible_connection to ssh 34589 1727204106.91905: Set connection var ansible_pipelining to False 34589 1727204106.91918: variable 'ansible_shell_executable' from source: unknown 34589 1727204106.91925: variable 'ansible_connection' from source: unknown 34589 1727204106.91928: variable 
'ansible_module_compression' from source: unknown 34589 1727204106.91930: variable 'ansible_shell_type' from source: unknown 34589 1727204106.91933: variable 'ansible_shell_executable' from source: unknown 34589 1727204106.91935: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204106.91937: variable 'ansible_pipelining' from source: unknown 34589 1727204106.91939: variable 'ansible_timeout' from source: unknown 34589 1727204106.91941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204106.92146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204106.92150: variable 'omit' from source: magic vars 34589 1727204106.92153: starting attempt loop 34589 1727204106.92155: running the handler 34589 1727204106.92157: _low_level_execute_command(): starting 34589 1727204106.92159: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204106.92804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204106.92819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.92830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204106.92844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.92856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204106.92863: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204106.92873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.92888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204106.92906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204106.92909: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204106.92911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.92938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204106.92942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.92944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204106.92947: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204106.92984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.93026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.93047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.93058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.93168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.95245: stdout chunk (state=3): >>>/root <<< 34589 1727204106.95248: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 34589 1727204106.95251: stdout chunk (state=3): >>><<< 34589 1727204106.95259: stderr chunk (state=3): >>><<< 34589 1727204106.95283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.95298: _low_level_execute_command(): starting 34589 1727204106.95306: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232 `" && echo ansible-tmp-1727204106.9528325-35530-262753390361232="` echo /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232 `" ) && sleep 0' 34589 1727204106.96072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.96105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204106.96116: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204106.96119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204106.96121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.96123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.96189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.96206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.96212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204106.96324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204106.98501: stdout chunk (state=3): 
>>>ansible-tmp-1727204106.9528325-35530-262753390361232=/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232 <<< 34589 1727204106.98725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204106.98767: stdout chunk (state=3): >>><<< 34589 1727204106.98770: stderr chunk (state=3): >>><<< 34589 1727204106.98809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.9528325-35530-262753390361232=/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204106.98881: variable 'ansible_module_compression' from source: unknown 34589 1727204106.98906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204106.98950: variable 'ansible_facts' from source: unknown 34589 1727204106.99127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py 34589 1727204106.99197: Sending initial data 34589 1727204106.99206: Sent initial data (156 bytes) 34589 1727204106.99800: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204106.99808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204106.99822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204106.99836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204106.99848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204106.99891: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204106.99944: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204106.99955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204106.99964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.00067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.01850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204107.01936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204107.02009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmps_4v04re /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py <<< 34589 1727204107.02012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py" <<< 34589 1727204107.02116: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmps_4v04re" to remote "/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py" <<< 34589 1727204107.02996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.03210: stderr chunk (state=3): >>><<< 34589 1727204107.03213: stdout chunk (state=3): >>><<< 34589 1727204107.03215: done transferring module to remote 34589 1727204107.03221: _low_level_execute_command(): starting 34589 1727204107.03223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/ /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py && sleep 0' 34589 1727204107.03774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204107.03795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.03811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.03899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.03929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.03944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.03967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.04084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.06084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.06097: stdout chunk (state=3): >>><<< 34589 1727204107.06114: stderr chunk (state=3): >>><<< 34589 1727204107.06215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.06218: _low_level_execute_command(): starting 34589 1727204107.06222: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/AnsiballZ_command.py && sleep 0' 34589 1727204107.06765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204107.06782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.06797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.06831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.06890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.06933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.06961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.06974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.07095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.24381: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:55:07.235195", "end": "2024-09-24 14:55:07.241280", "delta": "0:00:00.006085", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204107.27600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204107.27632: stderr chunk (state=3): >>><<< 34589 1727204107.27636: stdout chunk (state=3): >>><<< 34589 1727204107.27652: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:55:07.235195", "end": "2024-09-24 14:55:07.241280", "delta": "0:00:00.006085", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
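[Editor's note] Up to this point the trace covers one complete loop iteration: Ansible resolved the first loop item, reused the multiplexed SSH connection, created a remote temp directory, pushed the cached AnsiballZ command payload over SFTP, and executed 'ip link add ethtest0 type veth peer name peerethtest0', which returned rc=0. Below is a minimal, self-contained sketch of a play that would produce this sequence. The task name, module, loop items and conditional are taken from the log itself; the play-level vars and changed_when are assumptions (in the traced run the variables come from set_fact and include params, and the task is additionally gated on ansible_distribution_major_version != '6').

# Sketch only -- reconstructed from the trace above, not the actual role file.
- hosts: managed-node1
  gather_facts: false
  vars:
    interface: ethtest0          # matches 'interface' from set_fact in the log
    type: veth
    state: present
    current_interfaces: []
  tasks:
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
      changed_when: false  # assumption: the module reports changed=true but the
                           # displayed result below shows changed=false, which
                           # suggests such an override in the real task
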
34589 1727204107.27682: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204107.27691: _low_level_execute_command(): starting 34589 1727204107.27693: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.9528325-35530-262753390361232/ > /dev/null 2>&1 && sleep 0' 34589 1727204107.28146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.28150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.28152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204107.28154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204107.28158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.28213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.28218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.28220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.28309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.32626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.32652: stderr chunk (state=3): >>><<< 34589 1727204107.32657: stdout chunk (state=3): >>><<< 34589 1727204107.32673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.32681: handler run complete 34589 1727204107.32698: Evaluated conditional (False): False 34589 1727204107.32709: attempt loop complete, returning result 34589 1727204107.32724: variable 'item' from source: unknown 34589 1727204107.32787: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.006085", "end": "2024-09-24 14:55:07.241280", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 14:55:07.235195" } 34589 1727204107.32945: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.32947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.32950: variable 'omit' from source: magic vars 34589 1727204107.33023: variable 'ansible_distribution_major_version' from source: facts 34589 1727204107.33027: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204107.33144: variable 'type' from source: set_fact 34589 1727204107.33147: variable 'state' from source: include params 34589 1727204107.33150: variable 'interface' from source: set_fact 34589 1727204107.33154: variable 'current_interfaces' from source: set_fact 34589 1727204107.33160: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34589 1727204107.33164: variable 'omit' from source: magic vars 34589 1727204107.33181: variable 'omit' from source: magic vars 34589 1727204107.33210: variable 'item' from source: unknown 34589 1727204107.33250: variable 'item' from source: unknown 34589 1727204107.33261: variable 'omit' from source: magic vars 34589 1727204107.33283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204107.33290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204107.33296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204107.33308: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204107.33311: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.33313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.33359: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204107.33362: Set connection var ansible_shell_executable to /bin/sh 34589 1727204107.33369: Set connection var ansible_timeout to 10 34589 1727204107.33372: Set connection var ansible_shell_type to sh 34589 
1727204107.33378: Set connection var ansible_connection to ssh 34589 1727204107.33385: Set connection var ansible_pipelining to False 34589 1727204107.33400: variable 'ansible_shell_executable' from source: unknown 34589 1727204107.33403: variable 'ansible_connection' from source: unknown 34589 1727204107.33405: variable 'ansible_module_compression' from source: unknown 34589 1727204107.33410: variable 'ansible_shell_type' from source: unknown 34589 1727204107.33413: variable 'ansible_shell_executable' from source: unknown 34589 1727204107.33415: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.33417: variable 'ansible_pipelining' from source: unknown 34589 1727204107.33419: variable 'ansible_timeout' from source: unknown 34589 1727204107.33421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.33484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204107.33492: variable 'omit' from source: magic vars 34589 1727204107.33496: starting attempt loop 34589 1727204107.33499: running the handler 34589 1727204107.33508: _low_level_execute_command(): starting 34589 1727204107.33511: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204107.33942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.33953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204107.33963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.33974: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.34036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.34039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.34041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.34128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.35901: stdout chunk (state=3): >>>/root <<< 34589 1727204107.35995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.36024: stderr chunk (state=3): >>><<< 34589 1727204107.36027: stdout chunk (state=3): >>><<< 34589 1727204107.36041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.36049: _low_level_execute_command(): starting 34589 1727204107.36054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047 `" && echo ansible-tmp-1727204107.360415-35530-13657374162047="` echo /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047 `" ) && sleep 0' 34589 1727204107.36472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.36505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204107.36511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204107.36513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.36515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.36517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204107.36521: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.36555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.36568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.36656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.38739: stdout chunk (state=3): >>>ansible-tmp-1727204107.360415-35530-13657374162047=/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047 <<< 34589 1727204107.38848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.38878: stderr chunk (state=3): >>><<< 34589 1727204107.38881: stdout chunk (state=3): >>><<< 34589 
1727204107.38897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204107.360415-35530-13657374162047=/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.38917: variable 'ansible_module_compression' from source: unknown 34589 1727204107.38946: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204107.38961: variable 'ansible_facts' from source: unknown 34589 1727204107.39012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py 34589 1727204107.39104: Sending initial data 34589 1727204107.39107: Sent initial data (154 bytes) 34589 1727204107.39560: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.39563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204107.39566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.39568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.39576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.39622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.39628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.39629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.39705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 
1727204107.41450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204107.41524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204107.41608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpqq5hvc8w /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py <<< 34589 1727204107.41611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py" <<< 34589 1727204107.41680: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpqq5hvc8w" to remote "/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py" <<< 34589 1727204107.41683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py" <<< 34589 1727204107.42344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.42387: stderr chunk (state=3): >>><<< 34589 1727204107.42391: stdout chunk (state=3): >>><<< 34589 1727204107.42422: done transferring module to remote 34589 1727204107.42430: _low_level_execute_command(): starting 34589 1727204107.42435: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/ /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py && sleep 0' 34589 1727204107.42862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.42869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204107.42899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.42902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.42904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 
1727204107.42958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.42965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.42967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.43042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.44987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.45016: stderr chunk (state=3): >>><<< 34589 1727204107.45019: stdout chunk (state=3): >>><<< 34589 1727204107.45034: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.45037: _low_level_execute_command(): starting 34589 1727204107.45040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/AnsiballZ_command.py && sleep 0' 34589 1727204107.45486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.45490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.45492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.45494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.45544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.45548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.45552: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 34589 1727204107.45635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.62728: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:55:07.621206", "end": "2024-09-24 14:55:07.625440", "delta": "0:00:00.004234", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204107.64513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204107.64540: stderr chunk (state=3): >>><<< 34589 1727204107.64543: stdout chunk (state=3): >>><<< 34589 1727204107.64558: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:55:07.621206", "end": "2024-09-24 14:55:07.625440", "delta": "0:00:00.004234", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
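[Editor's note] The second loop item follows the same low-level sequence as the first (home-directory probe, remote temp directory, SFTP upload of the cached AnsiballZ command payload, Python execution) and brings the peer end up with 'ip link set peerethtest0 up', again rc=0. To confirm the result independently of the playbook, a read-only check like the hypothetical task below works; the interface names come from the log, everything else is an assumption.

# Hypothetical follow-up check, not part of the traced run.
# 'ip link show <dev>' exits non-zero when the device is missing, failing the task.
- name: Verify veth pair ethtest0 / peerethtest0 exists
  ansible.builtin.command: ip link show {{ item }}
  loop:
    - ethtest0
    - peerethtest0
  changed_when: false
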
34589 1727204107.64585: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204107.64590: _low_level_execute_command(): starting 34589 1727204107.64595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.360415-35530-13657374162047/ > /dev/null 2>&1 && sleep 0' 34589 1727204107.65047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.65050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.65053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.65055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204107.65057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.65114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.65119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.65121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.65200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.67174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.67205: stderr chunk (state=3): >>><<< 34589 1727204107.67212: stdout chunk (state=3): >>><<< 34589 1727204107.67227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.67231: handler run complete 34589 1727204107.67247: Evaluated conditional (False): False 34589 1727204107.67255: attempt loop complete, returning result 34589 1727204107.67269: variable 'item' from source: unknown 34589 1727204107.67333: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004234", "end": "2024-09-24 14:55:07.625440", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 14:55:07.621206" } 34589 1727204107.67460: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.67463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.67465: variable 'omit' from source: magic vars 34589 1727204107.67560: variable 'ansible_distribution_major_version' from source: facts 34589 1727204107.67564: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204107.67678: variable 'type' from source: set_fact 34589 1727204107.67682: variable 'state' from source: include params 34589 1727204107.67686: variable 'interface' from source: set_fact 34589 1727204107.67688: variable 'current_interfaces' from source: set_fact 34589 1727204107.67695: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 34589 1727204107.67698: variable 'omit' from source: magic vars 34589 1727204107.67711: variable 'omit' from source: magic vars 34589 1727204107.67735: variable 'item' from source: unknown 34589 1727204107.67780: variable 'item' from source: unknown 34589 1727204107.67791: variable 'omit' from source: magic vars 34589 1727204107.67810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204107.67817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204107.67820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204107.67831: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204107.67833: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.67835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.67885: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204107.67888: Set connection var ansible_shell_executable to /bin/sh 34589 1727204107.67895: Set connection var ansible_timeout to 10 34589 1727204107.67897: Set connection var ansible_shell_type to sh 34589 1727204107.67904: Set connection var ansible_connection to ssh 34589 1727204107.67910: Set connection var 
ansible_pipelining to False 34589 1727204107.67923: variable 'ansible_shell_executable' from source: unknown 34589 1727204107.67926: variable 'ansible_connection' from source: unknown 34589 1727204107.67928: variable 'ansible_module_compression' from source: unknown 34589 1727204107.67930: variable 'ansible_shell_type' from source: unknown 34589 1727204107.67932: variable 'ansible_shell_executable' from source: unknown 34589 1727204107.67935: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204107.67939: variable 'ansible_pipelining' from source: unknown 34589 1727204107.67941: variable 'ansible_timeout' from source: unknown 34589 1727204107.67945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204107.68010: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204107.68014: variable 'omit' from source: magic vars 34589 1727204107.68020: starting attempt loop 34589 1727204107.68022: running the handler 34589 1727204107.68026: _low_level_execute_command(): starting 34589 1727204107.68031: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204107.68454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.68492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204107.68501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204107.68504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.68506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.68511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204107.68514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.68711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.68714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.68978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.70752: stdout chunk (state=3): >>>/root <<< 34589 1727204107.70848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.70883: stderr chunk (state=3): >>><<< 34589 1727204107.70886: stdout chunk (state=3): >>><<< 34589 1727204107.70900: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.70908: _low_level_execute_command(): starting 34589 1727204107.70915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037 `" && echo ansible-tmp-1727204107.7089984-35530-146862472805037="` echo /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037 `" ) && sleep 0' 34589 1727204107.71355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.71359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204107.71364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.71377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.71433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.71436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.71439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.71523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.73643: stdout chunk (state=3): >>>ansible-tmp-1727204107.7089984-35530-146862472805037=/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037 <<< 34589 1727204107.73752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.73785: stderr chunk (state=3): >>><<< 34589 1727204107.73788: stdout chunk (state=3): >>><<< 34589 1727204107.73803: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204107.7089984-35530-146862472805037=/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.73826: variable 'ansible_module_compression' from source: unknown 34589 1727204107.73855: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204107.73874: variable 'ansible_facts' from source: unknown 34589 1727204107.73921: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py 34589 1727204107.74015: Sending initial data 34589 1727204107.74018: Sent initial data (156 bytes) 34589 1727204107.74687: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.74691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.74703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.74784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.74788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.74894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.76684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204107.76770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204107.76845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp6oz7r20u /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py <<< 34589 1727204107.76849: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py" <<< 34589 1727204107.76957: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp6oz7r20u" to remote "/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py" <<< 34589 1727204107.77896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.77942: stderr chunk (state=3): >>><<< 34589 1727204107.77979: stdout chunk (state=3): >>><<< 34589 1727204107.78035: done transferring module to remote 34589 1727204107.78038: _low_level_execute_command(): starting 34589 1727204107.78043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/ /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py && sleep 0' 34589 1727204107.78712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.78782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204107.78800: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204107.78840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204107.78857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.78891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.79000: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 34589 1727204107.80985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204107.81039: stderr chunk (state=3): >>><<< 34589 1727204107.81148: stdout chunk (state=3): >>><<< 34589 1727204107.81152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204107.81155: _low_level_execute_command(): starting 34589 1727204107.81158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/AnsiballZ_command.py && sleep 0' 34589 1727204107.81742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204107.81758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204107.81772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204107.81795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204107.81823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204107.81934: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204107.81955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204107.82080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204107.99227: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:55:07.984135", "end": "2024-09-24 
14:55:07.988195", "delta": "0:00:00.004060", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204108.00934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204108.00938: stdout chunk (state=3): >>><<< 34589 1727204108.00941: stderr chunk (state=3): >>><<< 34589 1727204108.01180: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:55:07.984135", "end": "2024-09-24 14:55:07.988195", "delta": "0:00:00.004060", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
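The round trip logged above (home-directory probe with 'echo ~', remote tmpdir creation under /root/.ansible/tmp, SFTP upload of AnsiballZ_command.py, chmod, Python execution, then cleanup) is the standard command-module execution path; the two results shown are loop items of the "Create veth interface ethtest0" task. A minimal sketch of what that looping task likely looks like, reconstructed from the logged commands and the logged conditional (type == 'veth' and state == 'present' and interface not in current_interfaces); the first loop item is an assumption, since it ran before this excerpt, and the variables interface/type/state/current_interfaces come from set_fact and include params as noted in the log:

  - name: Create veth interface {{ interface }}
    command: "{{ item }}"
    loop:
      - ip link add {{ interface }} type veth peer name peer{{ interface }}   # assumed; not shown in this excerpt
      - ip link set peer{{ interface }} up
      - ip link set {{ interface }} up
    when: type == 'veth' and state == 'present' and interface not in current_interfaces

This is a reconstruction from the log, not the verbatim content of manage_test_interface.yml.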
34589 1727204108.01184: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204108.01186: _low_level_execute_command(): starting 34589 1727204108.01189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.7089984-35530-146862472805037/ > /dev/null 2>&1 && sleep 0' 34589 1727204108.02174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204108.02213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204108.02235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.02255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.02427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.04517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.04527: stderr chunk (state=3): >>><<< 34589 1727204108.04530: stdout chunk (state=3): >>><<< 34589 1727204108.04569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204108.04573: handler run complete 34589 1727204108.04596: Evaluated conditional (False): False 34589 1727204108.04605: attempt loop complete, returning result 34589 1727204108.04626: variable 'item' from source: unknown 34589 1727204108.05002: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004060", "end": "2024-09-24 14:55:07.988195", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 14:55:07.984135" } 34589 1727204108.05381: dumping result to json 34589 1727204108.05385: done dumping result, returning 34589 1727204108.05387: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b2] 34589 1727204108.05390: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b2 34589 1727204108.05725: no more pending results, returning what we have 34589 1727204108.05729: results queue empty 34589 1727204108.05730: checking for any_errors_fatal 34589 1727204108.05734: done checking for any_errors_fatal 34589 1727204108.05735: checking for max_fail_percentage 34589 1727204108.05736: done checking for max_fail_percentage 34589 1727204108.05737: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.05738: done checking to see if all hosts have failed 34589 1727204108.05738: getting the remaining hosts for this loop 34589 1727204108.05739: done getting the remaining hosts for this loop 34589 1727204108.05743: getting the next task for host managed-node1 34589 1727204108.05749: done getting next task for host managed-node1 34589 1727204108.05751: ^ task is: TASK: Set up veth as managed by NetworkManager 34589 1727204108.05754: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.05764: getting variables 34589 1727204108.05766: in VariableManager get_vars() 34589 1727204108.05804: Calling all_inventory to load vars for managed-node1 34589 1727204108.05807: Calling groups_inventory to load vars for managed-node1 34589 1727204108.05810: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.05821: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.05824: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.05827: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.06392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.07037: done with get_vars() 34589 1727204108.07050: done getting variables 34589 1727204108.07189: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b2 34589 1727204108.07193: WORKER PROCESS EXITING 34589 1727204108.07235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:55:08 -0400 (0:00:01.195) 0:00:08.207 ***** 34589 1727204108.07265: entering _queue_task() for managed-node1/command 34589 1727204108.08308: worker is 1 (out of 1 available) 34589 1727204108.08321: exiting _queue_task() for managed-node1/command 34589 1727204108.08333: done queuing things up, now waiting for results queue to drain 34589 1727204108.08335: waiting for pending results... 
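The task queued here (manage_test_interface.yml:35) runs a single nmcli command; the lines that follow show the conditional (type == 'veth' and state == 'present') evaluating True and the command nmcli d set ethtest0 managed true being executed. A sketch consistent with that, again a reconstruction rather than the verbatim file:

  - name: Set up veth as managed by NetworkManager
    command: nmcli d set {{ interface }} managed true
    when: type == 'veth' and state == 'present'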
34589 1727204108.08610: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 34589 1727204108.08715: in run() - task 028d2410-947f-a9c6-cddc-0000000001b3 34589 1727204108.08909: variable 'ansible_search_path' from source: unknown 34589 1727204108.08921: variable 'ansible_search_path' from source: unknown 34589 1727204108.09082: calling self._execute() 34589 1727204108.09086: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.09182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.09274: variable 'omit' from source: magic vars 34589 1727204108.10156: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.10212: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.10855: variable 'type' from source: set_fact 34589 1727204108.10892: variable 'state' from source: include params 34589 1727204108.11057: Evaluated conditional (type == 'veth' and state == 'present'): True 34589 1727204108.11061: variable 'omit' from source: magic vars 34589 1727204108.11137: variable 'omit' from source: magic vars 34589 1727204108.11429: variable 'interface' from source: set_fact 34589 1727204108.11757: variable 'omit' from source: magic vars 34589 1727204108.11761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204108.11764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204108.11766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204108.11769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204108.11770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204108.11772: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204108.11774: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.11779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.12203: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204108.12206: Set connection var ansible_shell_executable to /bin/sh 34589 1727204108.12208: Set connection var ansible_timeout to 10 34589 1727204108.12210: Set connection var ansible_shell_type to sh 34589 1727204108.12213: Set connection var ansible_connection to ssh 34589 1727204108.12215: Set connection var ansible_pipelining to False 34589 1727204108.12216: variable 'ansible_shell_executable' from source: unknown 34589 1727204108.12218: variable 'ansible_connection' from source: unknown 34589 1727204108.12220: variable 'ansible_module_compression' from source: unknown 34589 1727204108.12222: variable 'ansible_shell_type' from source: unknown 34589 1727204108.12224: variable 'ansible_shell_executable' from source: unknown 34589 1727204108.12226: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.12228: variable 'ansible_pipelining' from source: unknown 34589 1727204108.12230: variable 'ansible_timeout' from source: unknown 34589 1727204108.12231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.12627: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204108.12756: variable 'omit' from source: magic vars 34589 1727204108.12917: starting attempt loop 34589 1727204108.12990: running the handler 34589 1727204108.13012: _low_level_execute_command(): starting 34589 1727204108.13026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204108.14164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204108.14200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.14222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.14570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.16285: stdout chunk (state=3): >>>/root <<< 34589 1727204108.16547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.16551: stdout chunk (state=3): >>><<< 34589 1727204108.16553: stderr chunk (state=3): >>><<< 34589 1727204108.16574: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204108.16600: _low_level_execute_command(): starting 34589 1727204108.16825: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488 `" && echo ansible-tmp-1727204108.1658556-35741-162205357580488="` echo /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488 `" ) && sleep 0' 34589 1727204108.17792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204108.18079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.18114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.18227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.20332: stdout chunk (state=3): >>>ansible-tmp-1727204108.1658556-35741-162205357580488=/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488 <<< 34589 1727204108.20470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.20881: stdout chunk (state=3): >>><<< 34589 1727204108.20885: stderr chunk (state=3): >>><<< 34589 1727204108.20887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204108.1658556-35741-162205357580488=/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204108.20890: variable 'ansible_module_compression' from source: unknown 34589 
1727204108.20892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204108.20894: variable 'ansible_facts' from source: unknown 34589 1727204108.20947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py 34589 1727204108.21516: Sending initial data 34589 1727204108.21519: Sent initial data (156 bytes) 34589 1727204108.22455: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204108.22799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.22847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.22928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.24729: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204108.24805: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204108.24875: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpok9vk_mr /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py <<< 34589 1727204108.24889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py" <<< 34589 1727204108.25071: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpok9vk_mr" to remote "/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py" <<< 34589 1727204108.26582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.26586: stderr chunk (state=3): >>><<< 34589 1727204108.26589: stdout chunk (state=3): >>><<< 34589 1727204108.26641: done transferring module to remote 34589 1727204108.26659: _low_level_execute_command(): starting 34589 1727204108.26670: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/ /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py && sleep 0' 34589 1727204108.28002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204108.28020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204108.28063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204108.28370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204108.28397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.28511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.30498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.30511: stdout chunk (state=3): >>><<< 34589 1727204108.30629: stderr chunk (state=3): >>><<< 34589 1727204108.30632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204108.30640: _low_level_execute_command(): starting 34589 1727204108.30643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/AnsiballZ_command.py && sleep 0' 34589 1727204108.32387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204108.32681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204108.32689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.32915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.32978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.51302: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:55:08.491413", "end": "2024-09-24 14:55:08.510331", "delta": "0:00:00.018918", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204108.53313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204108.53318: stdout chunk (state=3): >>><<< 34589 1727204108.53320: stderr chunk (state=3): >>><<< 34589 1727204108.53323: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:55:08.491413", "end": "2024-09-24 14:55:08.510331", "delta": "0:00:00.018918", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
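nmcli returned rc=0, so ethtest0 should now be managed by NetworkManager. If one wanted to verify that by hand, a hypothetical check task could query the device state; this is illustrative only and is not part of the original test playbook:

  - name: Show NetworkManager state for the test device (illustrative, not in the original test)
    command: nmcli -g GENERAL.STATE device show {{ interface }}
    changed_when: false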
34589 1727204108.53325: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204108.53327: _low_level_execute_command(): starting 34589 1727204108.53329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204108.1658556-35741-162205357580488/ > /dev/null 2>&1 && sleep 0' 34589 1727204108.54612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204108.54699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204108.54716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204108.55059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204108.55125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204108.55197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204108.55612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204108.57766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204108.57786: stdout chunk (state=3): >>><<< 34589 1727204108.57799: stderr chunk (state=3): >>><<< 34589 1727204108.57822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204108.57982: handler run complete 34589 1727204108.57986: Evaluated conditional (False): False 34589 1727204108.57988: attempt loop complete, returning result 34589 1727204108.57990: _execute() done 34589 1727204108.57993: dumping result to json 34589 1727204108.57996: done dumping result, returning 34589 1727204108.57998: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [028d2410-947f-a9c6-cddc-0000000001b3] 34589 1727204108.58000: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b3 ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018918", "end": "2024-09-24 14:55:08.510331", "rc": 0, "start": "2024-09-24 14:55:08.491413" } 34589 1727204108.58490: no more pending results, returning what we have 34589 1727204108.58494: results queue empty 34589 1727204108.58495: checking for any_errors_fatal 34589 1727204108.58507: done checking for any_errors_fatal 34589 1727204108.58508: checking for max_fail_percentage 34589 1727204108.58510: done checking for max_fail_percentage 34589 1727204108.58511: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.58512: done checking to see if all hosts have failed 34589 1727204108.58513: getting the remaining hosts for this loop 34589 1727204108.58514: done getting the remaining hosts for this loop 34589 1727204108.58518: getting the next task for host managed-node1 34589 1727204108.58524: done getting next task for host managed-node1 34589 1727204108.58527: ^ task is: TASK: Delete veth interface {{ interface }} 34589 1727204108.58531: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.58534: getting variables 34589 1727204108.58536: in VariableManager get_vars() 34589 1727204108.58574: Calling all_inventory to load vars for managed-node1 34589 1727204108.59023: Calling groups_inventory to load vars for managed-node1 34589 1727204108.59027: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.59039: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.59042: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.59046: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.59385: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b3 34589 1727204108.59393: WORKER PROCESS EXITING 34589 1727204108.59421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.60007: done with get_vars() 34589 1727204108.60021: done getting variables 34589 1727204108.60203: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204108.60437: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.532) 0:00:08.739 ***** 34589 1727204108.60468: entering _queue_task() for managed-node1/command 34589 1727204108.61121: worker is 1 (out of 1 available) 34589 1727204108.61248: exiting _queue_task() for managed-node1/command 34589 1727204108.61260: done queuing things up, now waiting for results queue to drain 34589 1727204108.61261: waiting for pending results... 
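The "Delete veth interface ethtest0" task queued here is skipped in the lines that follow, because the logged conditional requires state == 'absent' while this run uses state 'present'. A sketch of the task implied by that conditional; the exact ip command is an assumption:

  - name: Delete veth interface {{ interface }}
    command: ip link del {{ interface }} type veth peer name peer{{ interface }}
    when: type == 'veth' and state == 'absent' and interface in current_interfaces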
34589 1727204108.61805: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 34589 1727204108.61858: in run() - task 028d2410-947f-a9c6-cddc-0000000001b4 34589 1727204108.61921: variable 'ansible_search_path' from source: unknown 34589 1727204108.61929: variable 'ansible_search_path' from source: unknown 34589 1727204108.62023: calling self._execute() 34589 1727204108.62228: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.62234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.62250: variable 'omit' from source: magic vars 34589 1727204108.63022: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.63037: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.63380: variable 'type' from source: set_fact 34589 1727204108.63645: variable 'state' from source: include params 34589 1727204108.63648: variable 'interface' from source: set_fact 34589 1727204108.63650: variable 'current_interfaces' from source: set_fact 34589 1727204108.63653: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 34589 1727204108.63655: when evaluation is False, skipping this task 34589 1727204108.63657: _execute() done 34589 1727204108.63658: dumping result to json 34589 1727204108.63660: done dumping result, returning 34589 1727204108.63662: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b4] 34589 1727204108.63664: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b4 34589 1727204108.63731: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b4 34589 1727204108.63735: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34589 1727204108.63801: no more pending results, returning what we have 34589 1727204108.63805: results queue empty 34589 1727204108.63806: checking for any_errors_fatal 34589 1727204108.63816: done checking for any_errors_fatal 34589 1727204108.63817: checking for max_fail_percentage 34589 1727204108.63819: done checking for max_fail_percentage 34589 1727204108.63820: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.63821: done checking to see if all hosts have failed 34589 1727204108.63821: getting the remaining hosts for this loop 34589 1727204108.63823: done getting the remaining hosts for this loop 34589 1727204108.63827: getting the next task for host managed-node1 34589 1727204108.63833: done getting next task for host managed-node1 34589 1727204108.63836: ^ task is: TASK: Create dummy interface {{ interface }} 34589 1727204108.63840: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.63844: getting variables 34589 1727204108.63846: in VariableManager get_vars() 34589 1727204108.63889: Calling all_inventory to load vars for managed-node1 34589 1727204108.63892: Calling groups_inventory to load vars for managed-node1 34589 1727204108.63895: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.63909: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.63911: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.63914: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.64663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.65091: done with get_vars() 34589 1727204108.65104: done getting variables 34589 1727204108.65160: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204108.65280: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.049) 0:00:08.789 ***** 34589 1727204108.65424: entering _queue_task() for managed-node1/command 34589 1727204108.66209: worker is 1 (out of 1 available) 34589 1727204108.66221: exiting _queue_task() for managed-node1/command 34589 1727204108.66231: done queuing things up, now waiting for results queue to drain 34589 1727204108.66232: waiting for pending results... 
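
The next task queued is TASK [Create dummy interface ethtest0] from manage_test_interface.yml:49, the "present" counterpart of the pattern above; in the following records its condition (type == 'dummy' and state == 'present' and interface not in current_interfaces) also evaluates to False. A hedged sketch, with the same caveat that only the name, action plugin and conditional come from the log:

  - name: Create dummy interface {{ interface }}
    # The "not in current_interfaces" guard keeps the helper idempotent;
    # the ip command is an assumption.
    command: ip link add {{ interface }} type dummy
    when: type == 'dummy' and state == 'present' and interface not in current_interfaces
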
34589 1727204108.66582: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 34589 1727204108.66927: in run() - task 028d2410-947f-a9c6-cddc-0000000001b5 34589 1727204108.66931: variable 'ansible_search_path' from source: unknown 34589 1727204108.66934: variable 'ansible_search_path' from source: unknown 34589 1727204108.66944: calling self._execute() 34589 1727204108.67217: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.67221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.67223: variable 'omit' from source: magic vars 34589 1727204108.68048: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.68183: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.68693: variable 'type' from source: set_fact 34589 1727204108.68900: variable 'state' from source: include params 34589 1727204108.68905: variable 'interface' from source: set_fact 34589 1727204108.68907: variable 'current_interfaces' from source: set_fact 34589 1727204108.68911: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 34589 1727204108.68913: when evaluation is False, skipping this task 34589 1727204108.68915: _execute() done 34589 1727204108.68917: dumping result to json 34589 1727204108.68920: done dumping result, returning 34589 1727204108.68922: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b5] 34589 1727204108.68924: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b5 34589 1727204108.68993: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b5 34589 1727204108.68997: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34589 1727204108.69230: no more pending results, returning what we have 34589 1727204108.69234: results queue empty 34589 1727204108.69235: checking for any_errors_fatal 34589 1727204108.69241: done checking for any_errors_fatal 34589 1727204108.69241: checking for max_fail_percentage 34589 1727204108.69243: done checking for max_fail_percentage 34589 1727204108.69244: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.69245: done checking to see if all hosts have failed 34589 1727204108.69245: getting the remaining hosts for this loop 34589 1727204108.69247: done getting the remaining hosts for this loop 34589 1727204108.69251: getting the next task for host managed-node1 34589 1727204108.69257: done getting next task for host managed-node1 34589 1727204108.69260: ^ task is: TASK: Delete dummy interface {{ interface }} 34589 1727204108.69263: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.69268: getting variables 34589 1727204108.69270: in VariableManager get_vars() 34589 1727204108.69314: Calling all_inventory to load vars for managed-node1 34589 1727204108.69317: Calling groups_inventory to load vars for managed-node1 34589 1727204108.69320: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.69333: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.69336: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.69340: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.70354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.70563: done with get_vars() 34589 1727204108.70977: done getting variables 34589 1727204108.71039: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204108.71155: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.061) 0:00:08.850 ***** 34589 1727204108.71588: entering _queue_task() for managed-node1/command 34589 1727204108.72607: worker is 1 (out of 1 available) 34589 1727204108.72621: exiting _queue_task() for managed-node1/command 34589 1727204108.72635: done queuing things up, now waiting for results queue to drain 34589 1727204108.72636: waiting for pending results... 
34589 1727204108.73624: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 34589 1727204108.73826: in run() - task 028d2410-947f-a9c6-cddc-0000000001b6 34589 1727204108.73832: variable 'ansible_search_path' from source: unknown 34589 1727204108.73835: variable 'ansible_search_path' from source: unknown 34589 1727204108.74045: calling self._execute() 34589 1727204108.74049: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.74495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.74497: variable 'omit' from source: magic vars 34589 1727204108.75207: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.75355: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.76002: variable 'type' from source: set_fact 34589 1727204108.76016: variable 'state' from source: include params 34589 1727204108.76024: variable 'interface' from source: set_fact 34589 1727204108.76099: variable 'current_interfaces' from source: set_fact 34589 1727204108.76116: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 34589 1727204108.76245: when evaluation is False, skipping this task 34589 1727204108.76248: _execute() done 34589 1727204108.76250: dumping result to json 34589 1727204108.76252: done dumping result, returning 34589 1727204108.76254: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b6] 34589 1727204108.76257: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b6 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34589 1727204108.76568: no more pending results, returning what we have 34589 1727204108.76572: results queue empty 34589 1727204108.76573: checking for any_errors_fatal 34589 1727204108.76581: done checking for any_errors_fatal 34589 1727204108.76582: checking for max_fail_percentage 34589 1727204108.76584: done checking for max_fail_percentage 34589 1727204108.76585: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.76585: done checking to see if all hosts have failed 34589 1727204108.76586: getting the remaining hosts for this loop 34589 1727204108.76588: done getting the remaining hosts for this loop 34589 1727204108.76592: getting the next task for host managed-node1 34589 1727204108.76598: done getting next task for host managed-node1 34589 1727204108.76601: ^ task is: TASK: Create tap interface {{ interface }} 34589 1727204108.76605: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.76610: getting variables 34589 1727204108.76612: in VariableManager get_vars() 34589 1727204108.76652: Calling all_inventory to load vars for managed-node1 34589 1727204108.76656: Calling groups_inventory to load vars for managed-node1 34589 1727204108.76658: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.76671: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.76674: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.77095: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.77636: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b6 34589 1727204108.77640: WORKER PROCESS EXITING 34589 1727204108.77663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.78417: done with get_vars() 34589 1727204108.78431: done getting variables 34589 1727204108.78496: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204108.78969: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.074) 0:00:08.925 ***** 34589 1727204108.79067: entering _queue_task() for managed-node1/command 34589 1727204108.79736: worker is 1 (out of 1 available) 34589 1727204108.79864: exiting _queue_task() for managed-node1/command 34589 1727204108.79880: done queuing things up, now waiting for results queue to drain 34589 1727204108.79881: waiting for pending results... 
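
TASK [Create tap interface ethtest0] at manage_test_interface.yml:60 follows the same conditional pattern (type == 'tap' and state == 'present' and interface not in current_interfaces, evaluated False below). The only notable difference in a sketch is that tap devices are usually created with ip tuntap rather than ip link add; the command shown is an assumption:

  - name: Create tap interface {{ interface }}
    command: ip tuntap add dev {{ interface }} mode tap
    when: type == 'tap' and state == 'present' and interface not in current_interfaces
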
34589 1727204108.80382: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 34589 1727204108.80510: in run() - task 028d2410-947f-a9c6-cddc-0000000001b7 34589 1727204108.80538: variable 'ansible_search_path' from source: unknown 34589 1727204108.80549: variable 'ansible_search_path' from source: unknown 34589 1727204108.80598: calling self._execute() 34589 1727204108.80698: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.80711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.80725: variable 'omit' from source: magic vars 34589 1727204108.81336: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.81339: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.81803: variable 'type' from source: set_fact 34589 1727204108.81834: variable 'state' from source: include params 34589 1727204108.81843: variable 'interface' from source: set_fact 34589 1727204108.81989: variable 'current_interfaces' from source: set_fact 34589 1727204108.81993: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 34589 1727204108.81995: when evaluation is False, skipping this task 34589 1727204108.81998: _execute() done 34589 1727204108.82000: dumping result to json 34589 1727204108.82002: done dumping result, returning 34589 1727204108.82004: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b7] 34589 1727204108.82006: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b7 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 34589 1727204108.82324: no more pending results, returning what we have 34589 1727204108.82327: results queue empty 34589 1727204108.82328: checking for any_errors_fatal 34589 1727204108.82334: done checking for any_errors_fatal 34589 1727204108.82335: checking for max_fail_percentage 34589 1727204108.82337: done checking for max_fail_percentage 34589 1727204108.82338: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.82338: done checking to see if all hosts have failed 34589 1727204108.82339: getting the remaining hosts for this loop 34589 1727204108.82340: done getting the remaining hosts for this loop 34589 1727204108.82344: getting the next task for host managed-node1 34589 1727204108.82350: done getting next task for host managed-node1 34589 1727204108.82352: ^ task is: TASK: Delete tap interface {{ interface }} 34589 1727204108.82356: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.82360: getting variables 34589 1727204108.82362: in VariableManager get_vars() 34589 1727204108.82519: Calling all_inventory to load vars for managed-node1 34589 1727204108.82523: Calling groups_inventory to load vars for managed-node1 34589 1727204108.82525: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.82539: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.82542: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.82545: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.83151: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b7 34589 1727204108.83154: WORKER PROCESS EXITING 34589 1727204108.83178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.83386: done with get_vars() 34589 1727204108.83397: done getting variables 34589 1727204108.83451: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204108.83579: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.045) 0:00:08.971 ***** 34589 1727204108.83606: entering _queue_task() for managed-node1/command 34589 1727204108.83891: worker is 1 (out of 1 available) 34589 1727204108.84088: exiting _queue_task() for managed-node1/command 34589 1727204108.84099: done queuing things up, now waiting for results queue to drain 34589 1727204108.84100: waiting for pending results... 
34589 1727204108.84217: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 34589 1727204108.84452: in run() - task 028d2410-947f-a9c6-cddc-0000000001b8 34589 1727204108.84456: variable 'ansible_search_path' from source: unknown 34589 1727204108.84459: variable 'ansible_search_path' from source: unknown 34589 1727204108.84462: calling self._execute() 34589 1727204108.84511: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.84522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.84536: variable 'omit' from source: magic vars 34589 1727204108.85417: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.85637: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.85931: variable 'type' from source: set_fact 34589 1727204108.85942: variable 'state' from source: include params 34589 1727204108.85950: variable 'interface' from source: set_fact 34589 1727204108.85957: variable 'current_interfaces' from source: set_fact 34589 1727204108.85969: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 34589 1727204108.85979: when evaluation is False, skipping this task 34589 1727204108.85987: _execute() done 34589 1727204108.86001: dumping result to json 34589 1727204108.86011: done dumping result, returning 34589 1727204108.86188: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 [028d2410-947f-a9c6-cddc-0000000001b8] 34589 1727204108.86191: sending task result for task 028d2410-947f-a9c6-cddc-0000000001b8 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 34589 1727204108.86506: no more pending results, returning what we have 34589 1727204108.86510: results queue empty 34589 1727204108.86511: checking for any_errors_fatal 34589 1727204108.86518: done checking for any_errors_fatal 34589 1727204108.86519: checking for max_fail_percentage 34589 1727204108.86521: done checking for max_fail_percentage 34589 1727204108.86522: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.86522: done checking to see if all hosts have failed 34589 1727204108.86523: getting the remaining hosts for this loop 34589 1727204108.86524: done getting the remaining hosts for this loop 34589 1727204108.86528: getting the next task for host managed-node1 34589 1727204108.86536: done getting next task for host managed-node1 34589 1727204108.86540: ^ task is: TASK: Include the task 'assert_device_present.yml' 34589 1727204108.86543: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.86547: getting variables 34589 1727204108.86549: in VariableManager get_vars() 34589 1727204108.86710: Calling all_inventory to load vars for managed-node1 34589 1727204108.86713: Calling groups_inventory to load vars for managed-node1 34589 1727204108.86716: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.86731: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.86734: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.86737: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.87164: done sending task result for task 028d2410-947f-a9c6-cddc-0000000001b8 34589 1727204108.87169: WORKER PROCESS EXITING 34589 1727204108.87194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.87402: done with get_vars() 34589 1727204108.87414: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.039) 0:00:09.010 ***** 34589 1727204108.87512: entering _queue_task() for managed-node1/include_tasks 34589 1727204108.87841: worker is 1 (out of 1 available) 34589 1727204108.87854: exiting _queue_task() for managed-node1/include_tasks 34589 1727204108.87866: done queuing things up, now waiting for results queue to drain 34589 1727204108.87867: waiting for pending results... 34589 1727204108.88121: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' 34589 1727204108.88222: in run() - task 028d2410-947f-a9c6-cddc-00000000000e 34589 1727204108.88245: variable 'ansible_search_path' from source: unknown 34589 1727204108.88286: calling self._execute() 34589 1727204108.88387: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.88398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.88413: variable 'omit' from source: magic vars 34589 1727204108.88803: variable 'ansible_distribution_major_version' from source: facts 34589 1727204108.88822: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204108.88837: _execute() done 34589 1727204108.88877: dumping result to json 34589 1727204108.88889: done dumping result, returning 34589 1727204108.88899: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-a9c6-cddc-00000000000e] 34589 1727204108.88910: sending task result for task 028d2410-947f-a9c6-cddc-00000000000e 34589 1727204108.89307: done sending task result for task 028d2410-947f-a9c6-cddc-00000000000e 34589 1727204108.89310: WORKER PROCESS EXITING 34589 1727204108.89368: no more pending results, returning what we have 34589 1727204108.89377: in VariableManager get_vars() 34589 1727204108.89424: Calling all_inventory to load vars for managed-node1 34589 1727204108.89427: Calling groups_inventory to load vars for managed-node1 34589 1727204108.89429: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.89561: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.89566: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.89570: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.90711: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.91096: done with get_vars() 34589 1727204108.91104: variable 'ansible_search_path' from source: unknown 34589 1727204108.91117: we have included files to process 34589 1727204108.91119: generating all_blocks data 34589 1727204108.91120: done generating all_blocks data 34589 1727204108.91122: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34589 1727204108.91123: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34589 1727204108.91126: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 34589 1727204108.91523: in VariableManager get_vars() 34589 1727204108.91543: done with get_vars() 34589 1727204108.91768: done processing included file 34589 1727204108.91770: iterating over new_blocks loaded from include file 34589 1727204108.91771: in VariableManager get_vars() 34589 1727204108.91842: done with get_vars() 34589 1727204108.91844: filtering new block on tags 34589 1727204108.91863: done filtering new block on tags 34589 1727204108.91865: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 34589 1727204108.91870: extending task lists for all hosts with included blocks 34589 1727204108.95283: done extending task lists 34589 1727204108.95381: done processing included files 34589 1727204108.95382: results queue empty 34589 1727204108.95383: checking for any_errors_fatal 34589 1727204108.95386: done checking for any_errors_fatal 34589 1727204108.95386: checking for max_fail_percentage 34589 1727204108.95388: done checking for max_fail_percentage 34589 1727204108.95388: checking to see if all hosts have failed and the running result is not ok 34589 1727204108.95389: done checking to see if all hosts have failed 34589 1727204108.95390: getting the remaining hosts for this loop 34589 1727204108.95391: done getting the remaining hosts for this loop 34589 1727204108.95400: getting the next task for host managed-node1 34589 1727204108.95404: done getting next task for host managed-node1 34589 1727204108.95409: ^ task is: TASK: Include the task 'get_interface_stat.yml' 34589 1727204108.95411: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204108.95414: getting variables 34589 1727204108.95415: in VariableManager get_vars() 34589 1727204108.95430: Calling all_inventory to load vars for managed-node1 34589 1727204108.95433: Calling groups_inventory to load vars for managed-node1 34589 1727204108.95435: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204108.95442: Calling all_plugins_play to load vars for managed-node1 34589 1727204108.95444: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204108.95447: Calling groups_plugins_play to load vars for managed-node1 34589 1727204108.95742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204108.96167: done with get_vars() 34589 1727204108.96225: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.088) 0:00:09.098 ***** 34589 1727204108.96386: entering _queue_task() for managed-node1/include_tasks 34589 1727204108.97398: worker is 1 (out of 1 available) 34589 1727204108.97573: exiting _queue_task() for managed-node1/include_tasks 34589 1727204108.97588: done queuing things up, now waiting for results queue to drain 34589 1727204108.97589: waiting for pending results... 34589 1727204108.98074: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 34589 1727204108.98327: in run() - task 028d2410-947f-a9c6-cddc-0000000002bc 34589 1727204108.98331: variable 'ansible_search_path' from source: unknown 34589 1727204108.98334: variable 'ansible_search_path' from source: unknown 34589 1727204108.98604: calling self._execute() 34589 1727204108.98734: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204108.98738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204108.98741: variable 'omit' from source: magic vars 34589 1727204109.00075: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.00081: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.00085: _execute() done 34589 1727204109.00087: dumping result to json 34589 1727204109.00090: done dumping result, returning 34589 1727204109.00093: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-a9c6-cddc-0000000002bc] 34589 1727204109.00095: sending task result for task 028d2410-947f-a9c6-cddc-0000000002bc 34589 1727204109.00393: no more pending results, returning what we have 34589 1727204109.00399: in VariableManager get_vars() 34589 1727204109.00443: Calling all_inventory to load vars for managed-node1 34589 1727204109.00446: Calling groups_inventory to load vars for managed-node1 34589 1727204109.00448: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.00460: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.00463: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.00465: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.01512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.02247: done sending task result for task 028d2410-947f-a9c6-cddc-0000000002bc 34589 1727204109.02251: 
WORKER PROCESS EXITING 34589 1727204109.02263: done with get_vars() 34589 1727204109.02272: variable 'ansible_search_path' from source: unknown 34589 1727204109.02273: variable 'ansible_search_path' from source: unknown 34589 1727204109.02592: we have included files to process 34589 1727204109.02594: generating all_blocks data 34589 1727204109.02595: done generating all_blocks data 34589 1727204109.02597: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204109.02598: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204109.02600: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204109.03145: done processing included file 34589 1727204109.03147: iterating over new_blocks loaded from include file 34589 1727204109.03149: in VariableManager get_vars() 34589 1727204109.03287: done with get_vars() 34589 1727204109.03290: filtering new block on tags 34589 1727204109.03309: done filtering new block on tags 34589 1727204109.03312: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 34589 1727204109.03317: extending task lists for all hosts with included blocks 34589 1727204109.03588: done extending task lists 34589 1727204109.03590: done processing included files 34589 1727204109.03591: results queue empty 34589 1727204109.03591: checking for any_errors_fatal 34589 1727204109.03653: done checking for any_errors_fatal 34589 1727204109.03655: checking for max_fail_percentage 34589 1727204109.03656: done checking for max_fail_percentage 34589 1727204109.03657: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.03658: done checking to see if all hosts have failed 34589 1727204109.03658: getting the remaining hosts for this loop 34589 1727204109.03659: done getting the remaining hosts for this loop 34589 1727204109.03662: getting the next task for host managed-node1 34589 1727204109.03667: done getting next task for host managed-node1 34589 1727204109.03669: ^ task is: TASK: Get stat for interface {{ interface }} 34589 1727204109.03672: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204109.03677: getting variables 34589 1727204109.03678: in VariableManager get_vars() 34589 1727204109.03692: Calling all_inventory to load vars for managed-node1 34589 1727204109.03695: Calling groups_inventory to load vars for managed-node1 34589 1727204109.03697: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.03760: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.03765: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.03768: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.04212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.04734: done with get_vars() 34589 1727204109.04745: done getting variables 34589 1727204109.05383: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.090) 0:00:09.189 ***** 34589 1727204109.05417: entering _queue_task() for managed-node1/stat 34589 1727204109.06691: worker is 1 (out of 1 available) 34589 1727204109.06704: exiting _queue_task() for managed-node1/stat 34589 1727204109.06718: done queuing things up, now waiting for results queue to drain 34589 1727204109.06719: waiting for pending results... 34589 1727204109.07265: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 34589 1727204109.07402: in run() - task 028d2410-947f-a9c6-cddc-000000000373 34589 1727204109.07481: variable 'ansible_search_path' from source: unknown 34589 1727204109.07572: variable 'ansible_search_path' from source: unknown 34589 1727204109.07619: calling self._execute() 34589 1727204109.07924: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.07940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.07954: variable 'omit' from source: magic vars 34589 1727204109.09097: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.09126: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.09357: variable 'omit' from source: magic vars 34589 1727204109.09360: variable 'omit' from source: magic vars 34589 1727204109.09501: variable 'interface' from source: set_fact 34589 1727204109.09563: variable 'omit' from source: magic vars 34589 1727204109.09869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204109.09873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204109.09878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204109.09995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.10082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.10304: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204109.10309: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.10312: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 34589 1727204109.10468: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204109.10531: Set connection var ansible_shell_executable to /bin/sh 34589 1727204109.10544: Set connection var ansible_timeout to 10 34589 1727204109.10550: Set connection var ansible_shell_type to sh 34589 1727204109.10561: Set connection var ansible_connection to ssh 34589 1727204109.10569: Set connection var ansible_pipelining to False 34589 1727204109.10599: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.10739: variable 'ansible_connection' from source: unknown 34589 1727204109.10743: variable 'ansible_module_compression' from source: unknown 34589 1727204109.10745: variable 'ansible_shell_type' from source: unknown 34589 1727204109.10747: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.10749: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.10751: variable 'ansible_pipelining' from source: unknown 34589 1727204109.10753: variable 'ansible_timeout' from source: unknown 34589 1727204109.10755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.11395: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204109.11416: variable 'omit' from source: magic vars 34589 1727204109.11427: starting attempt loop 34589 1727204109.11581: running the handler 34589 1727204109.11585: _low_level_execute_command(): starting 34589 1727204109.11587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204109.13504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.13708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.13724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.14049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.15688: stdout chunk (state=3): >>>/root <<< 34589 1727204109.16086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.16091: stdout chunk (state=3): >>><<< 34589 1727204109.16098: stderr chunk (state=3): >>><<< 34589 1727204109.16129: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.16290: _low_level_execute_command(): starting 34589 1727204109.16300: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387 `" && echo ansible-tmp-1727204109.1619031-35907-201927584080387="` echo /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387 `" ) && sleep 0' 34589 1727204109.17528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204109.17531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204109.17534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204109.17537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204109.17565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204109.17672: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.18013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.18336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.20449: stdout chunk (state=3): >>>ansible-tmp-1727204109.1619031-35907-201927584080387=/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387 <<< 34589 1727204109.20696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.20701: stdout chunk (state=3): >>><<< 34589 1727204109.20704: stderr chunk (state=3): >>><<< 34589 1727204109.20734: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.1619031-35907-201927584080387=/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.20883: variable 'ansible_module_compression' from source: unknown 34589 1727204109.20965: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34589 1727204109.21036: variable 'ansible_facts' from source: unknown 34589 1727204109.21383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py 34589 1727204109.21663: Sending initial data 34589 1727204109.21722: Sent initial data (153 bytes) 34589 1727204109.23525: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204109.23541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.24004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204109.24021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.24116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.25868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204109.25939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204109.26026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp5xj6s3p9 /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py <<< 34589 1727204109.26039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py" <<< 34589 1727204109.26223: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp5xj6s3p9" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py" <<< 34589 1727204109.28068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.28072: stdout chunk (state=3): >>><<< 34589 1727204109.28080: stderr chunk (state=3): >>><<< 34589 1727204109.28129: done transferring module to remote 34589 1727204109.28139: _low_level_execute_command(): starting 34589 1727204109.28144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/ /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py && sleep 0' 34589 1727204109.29447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.29517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204109.29529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.29548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.29784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.31715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 34589 1727204109.31987: stderr chunk (state=3): >>><<< 34589 1727204109.31991: stdout chunk (state=3): >>><<< 34589 1727204109.31994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.32000: _low_level_execute_command(): starting 34589 1727204109.32002: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/AnsiballZ_stat.py && sleep 0' 34589 1727204109.32987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204109.33282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.33286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204109.33289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.33326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.33408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.50211: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29802, "dev": 23, "nlink": 1, "atime": 1727204107.2391276, "mtime": 1727204107.2391276, "ctime": 1727204107.2391276, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, 
"rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34589 1727204109.51884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204109.51889: stdout chunk (state=3): >>><<< 34589 1727204109.51891: stderr chunk (state=3): >>><<< 34589 1727204109.52066: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29802, "dev": 23, "nlink": 1, "atime": 1727204107.2391276, "mtime": 1727204107.2391276, "ctime": 1727204107.2391276, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204109.52069: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204109.52072: _low_level_execute_command(): starting 34589 1727204109.52074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.1619031-35907-201927584080387/ > /dev/null 2>&1 && sleep 0' 34589 1727204109.53480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.53534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.53693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.54015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.56047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.56060: stdout chunk (state=3): >>><<< 34589 1727204109.56085: stderr chunk (state=3): >>><<< 34589 1727204109.56108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.56125: handler run complete 34589 1727204109.56188: attempt loop complete, returning result 34589 1727204109.56195: _execute() done 34589 1727204109.56281: dumping result to json 34589 1727204109.56285: done dumping result, returning 34589 1727204109.56288: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-a9c6-cddc-000000000373] 34589 1727204109.56290: sending task result for task 028d2410-947f-a9c6-cddc-000000000373 34589 1727204109.56365: done sending task result for task 028d2410-947f-a9c6-cddc-000000000373 34589 1727204109.56368: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204107.2391276, "block_size": 4096, "blocks": 0, "ctime": 1727204107.2391276, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29802, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204107.2391276, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 34589 1727204109.56776: no more pending results, returning what we have 34589 1727204109.56792: results queue empty 34589 1727204109.56793: checking for any_errors_fatal 34589 1727204109.56794: done checking for any_errors_fatal 34589 1727204109.56795: checking for max_fail_percentage 34589 1727204109.56797: done checking for max_fail_percentage 34589 1727204109.56797: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.56798: done checking to see if all hosts have failed 34589 1727204109.56799: getting the remaining hosts for this loop 34589 1727204109.56800: done getting the remaining hosts for this loop 34589 1727204109.56804: getting the next task for host managed-node1 34589 1727204109.56815: done getting next task for host managed-node1 34589 1727204109.56823: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 34589 1727204109.56837: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204109.56868: getting variables 34589 1727204109.56870: in VariableManager get_vars() 34589 1727204109.57012: Calling all_inventory to load vars for managed-node1 34589 1727204109.57015: Calling groups_inventory to load vars for managed-node1 34589 1727204109.57017: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.57030: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.57032: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.57035: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.57810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.58396: done with get_vars() 34589 1727204109.58412: done getting variables 34589 1727204109.58523: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34589 1727204109.58657: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.532) 0:00:09.721 ***** 34589 1727204109.58701: entering _queue_task() for managed-node1/assert 34589 1727204109.58703: Creating lock for assert 34589 1727204109.59084: worker is 1 (out of 1 available) 34589 1727204109.59100: exiting _queue_task() for managed-node1/assert 34589 1727204109.59118: done queuing things up, now waiting for results queue to drain 34589 1727204109.59120: waiting for pending results... 
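The assert being queued here (assert_device_present.yml:5) checks interface_stat.stat.exists, as the conditional evaluation a little further down confirms; a minimal sketch of such a task, with any failure message omitted because none is visible in this log:

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists   # evaluated to True later in this log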
34589 1727204109.60004: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' 34589 1727204109.60015: in run() - task 028d2410-947f-a9c6-cddc-0000000002bd 34589 1727204109.60019: variable 'ansible_search_path' from source: unknown 34589 1727204109.60022: variable 'ansible_search_path' from source: unknown 34589 1727204109.60025: calling self._execute() 34589 1727204109.60057: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.60062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.60074: variable 'omit' from source: magic vars 34589 1727204109.60536: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.60542: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.60545: variable 'omit' from source: magic vars 34589 1727204109.60600: variable 'omit' from source: magic vars 34589 1727204109.60741: variable 'interface' from source: set_fact 34589 1727204109.60759: variable 'omit' from source: magic vars 34589 1727204109.61021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204109.61057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204109.61084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204109.61095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.61110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.61138: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204109.61141: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.61144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.61448: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204109.61453: Set connection var ansible_shell_executable to /bin/sh 34589 1727204109.61461: Set connection var ansible_timeout to 10 34589 1727204109.61464: Set connection var ansible_shell_type to sh 34589 1727204109.61471: Set connection var ansible_connection to ssh 34589 1727204109.61478: Set connection var ansible_pipelining to False 34589 1727204109.61502: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.61520: variable 'ansible_connection' from source: unknown 34589 1727204109.61523: variable 'ansible_module_compression' from source: unknown 34589 1727204109.61525: variable 'ansible_shell_type' from source: unknown 34589 1727204109.61527: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.61529: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.61531: variable 'ansible_pipelining' from source: unknown 34589 1727204109.61533: variable 'ansible_timeout' from source: unknown 34589 1727204109.61535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.61921: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34589 1727204109.61925: variable 'omit' from source: magic vars 34589 1727204109.61927: starting attempt loop 34589 1727204109.61930: running the handler 34589 1727204109.62257: variable 'interface_stat' from source: set_fact 34589 1727204109.62265: Evaluated conditional (interface_stat.stat.exists): True 34589 1727204109.62285: handler run complete 34589 1727204109.62384: attempt loop complete, returning result 34589 1727204109.62388: _execute() done 34589 1727204109.62390: dumping result to json 34589 1727204109.62392: done dumping result, returning 34589 1727204109.62394: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' [028d2410-947f-a9c6-cddc-0000000002bd] 34589 1727204109.62397: sending task result for task 028d2410-947f-a9c6-cddc-0000000002bd 34589 1727204109.62473: done sending task result for task 028d2410-947f-a9c6-cddc-0000000002bd ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34589 1727204109.62535: no more pending results, returning what we have 34589 1727204109.62539: results queue empty 34589 1727204109.62540: checking for any_errors_fatal 34589 1727204109.62549: done checking for any_errors_fatal 34589 1727204109.62550: checking for max_fail_percentage 34589 1727204109.62552: done checking for max_fail_percentage 34589 1727204109.62552: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.62553: done checking to see if all hosts have failed 34589 1727204109.62554: getting the remaining hosts for this loop 34589 1727204109.62555: done getting the remaining hosts for this loop 34589 1727204109.62559: getting the next task for host managed-node1 34589 1727204109.62566: done getting next task for host managed-node1 34589 1727204109.62568: ^ task is: TASK: Initialize the connection_failed flag 34589 1727204109.62571: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204109.62574: getting variables 34589 1727204109.62578: in VariableManager get_vars() 34589 1727204109.62618: Calling all_inventory to load vars for managed-node1 34589 1727204109.62621: Calling groups_inventory to load vars for managed-node1 34589 1727204109.62624: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.62635: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.62638: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.62640: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.63174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.63629: done with get_vars() 34589 1727204109.63642: done getting variables 34589 1727204109.63797: WORKER PROCESS EXITING 34589 1727204109.63834: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.051) 0:00:09.773 ***** 34589 1727204109.63865: entering _queue_task() for managed-node1/set_fact 34589 1727204109.65118: worker is 1 (out of 1 available) 34589 1727204109.65131: exiting _queue_task() for managed-node1/set_fact 34589 1727204109.65141: done queuing things up, now waiting for results queue to drain 34589 1727204109.65143: waiting for pending results... 
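The set_fact queued here (tests_ipv6_disabled.yml:23) produces the connection_failed: false fact shown in the task result below; a sketch, assuming no additional options beyond what the result reveals:

- name: Initialize the connection_failed flag
  set_fact:
    connection_failed: false   # value confirmed by ansible_facts in the task result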
34589 1727204109.65337: running TaskExecutor() for managed-node1/TASK: Initialize the connection_failed flag 34589 1727204109.65463: in run() - task 028d2410-947f-a9c6-cddc-00000000000f 34589 1727204109.65491: variable 'ansible_search_path' from source: unknown 34589 1727204109.65537: calling self._execute() 34589 1727204109.65638: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.65651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.65664: variable 'omit' from source: magic vars 34589 1727204109.66172: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.66193: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.66204: variable 'omit' from source: magic vars 34589 1727204109.66242: variable 'omit' from source: magic vars 34589 1727204109.66282: variable 'omit' from source: magic vars 34589 1727204109.66336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204109.66462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204109.66465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204109.66467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.66469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.66481: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204109.66489: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.66495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.66603: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204109.66617: Set connection var ansible_shell_executable to /bin/sh 34589 1727204109.66630: Set connection var ansible_timeout to 10 34589 1727204109.66636: Set connection var ansible_shell_type to sh 34589 1727204109.66646: Set connection var ansible_connection to ssh 34589 1727204109.66654: Set connection var ansible_pipelining to False 34589 1727204109.66691: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.66701: variable 'ansible_connection' from source: unknown 34589 1727204109.66786: variable 'ansible_module_compression' from source: unknown 34589 1727204109.66790: variable 'ansible_shell_type' from source: unknown 34589 1727204109.66793: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.66795: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.66796: variable 'ansible_pipelining' from source: unknown 34589 1727204109.66798: variable 'ansible_timeout' from source: unknown 34589 1727204109.66800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.66912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204109.66945: variable 'omit' from source: magic vars 34589 1727204109.66956: starting attempt loop 34589 1727204109.66963: 
running the handler 34589 1727204109.67016: handler run complete 34589 1727204109.67019: attempt loop complete, returning result 34589 1727204109.67021: _execute() done 34589 1727204109.67032: dumping result to json 34589 1727204109.67039: done dumping result, returning 34589 1727204109.67049: done running TaskExecutor() for managed-node1/TASK: Initialize the connection_failed flag [028d2410-947f-a9c6-cddc-00000000000f] 34589 1727204109.67057: sending task result for task 028d2410-947f-a9c6-cddc-00000000000f 34589 1727204109.67274: done sending task result for task 028d2410-947f-a9c6-cddc-00000000000f 34589 1727204109.67279: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "connection_failed": false }, "changed": false } 34589 1727204109.67342: no more pending results, returning what we have 34589 1727204109.67346: results queue empty 34589 1727204109.67347: checking for any_errors_fatal 34589 1727204109.67355: done checking for any_errors_fatal 34589 1727204109.67356: checking for max_fail_percentage 34589 1727204109.67358: done checking for max_fail_percentage 34589 1727204109.67359: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.67360: done checking to see if all hosts have failed 34589 1727204109.67361: getting the remaining hosts for this loop 34589 1727204109.67362: done getting the remaining hosts for this loop 34589 1727204109.67366: getting the next task for host managed-node1 34589 1727204109.67373: done getting next task for host managed-node1 34589 1727204109.67381: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204109.67385: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204109.67399: getting variables 34589 1727204109.67401: in VariableManager get_vars() 34589 1727204109.67561: Calling all_inventory to load vars for managed-node1 34589 1727204109.67564: Calling groups_inventory to load vars for managed-node1 34589 1727204109.67567: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.67661: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.67665: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.67669: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.67928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.68167: done with get_vars() 34589 1727204109.68182: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.044) 0:00:09.817 ***** 34589 1727204109.68291: entering _queue_task() for managed-node1/include_tasks 34589 1727204109.68694: worker is 1 (out of 1 available) 34589 1727204109.68710: exiting _queue_task() for managed-node1/include_tasks 34589 1727204109.68723: done queuing things up, now waiting for results queue to drain 34589 1727204109.68724: waiting for pending results... 34589 1727204109.69012: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204109.69151: in run() - task 028d2410-947f-a9c6-cddc-000000000017 34589 1727204109.69177: variable 'ansible_search_path' from source: unknown 34589 1727204109.69187: variable 'ansible_search_path' from source: unknown 34589 1727204109.69240: calling self._execute() 34589 1727204109.69428: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.69432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.69435: variable 'omit' from source: magic vars 34589 1727204109.69890: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.69911: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.69971: _execute() done 34589 1727204109.69974: dumping result to json 34589 1727204109.69979: done dumping result, returning 34589 1727204109.69982: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-a9c6-cddc-000000000017] 34589 1727204109.69989: sending task result for task 028d2410-947f-a9c6-cddc-000000000017 34589 1727204109.70120: no more pending results, returning what we have 34589 1727204109.70127: in VariableManager get_vars() 34589 1727204109.70379: Calling all_inventory to load vars for managed-node1 34589 1727204109.70385: Calling groups_inventory to load vars for managed-node1 34589 1727204109.70389: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.70401: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.70404: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.70410: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.70739: done sending task result for task 028d2410-947f-a9c6-cddc-000000000017 34589 1727204109.70742: WORKER PROCESS EXITING 34589 1727204109.70764: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.70997: done with get_vars() 34589 1727204109.71005: variable 'ansible_search_path' from source: unknown 34589 1727204109.71008: variable 'ansible_search_path' from source: unknown 34589 1727204109.71059: we have included files to process 34589 1727204109.71061: generating all_blocks data 34589 1727204109.71062: done generating all_blocks data 34589 1727204109.71066: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204109.71067: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204109.71069: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204109.71814: done processing included file 34589 1727204109.71816: iterating over new_blocks loaded from include file 34589 1727204109.71818: in VariableManager get_vars() 34589 1727204109.71839: done with get_vars() 34589 1727204109.71841: filtering new block on tags 34589 1727204109.71858: done filtering new block on tags 34589 1727204109.71860: in VariableManager get_vars() 34589 1727204109.71884: done with get_vars() 34589 1727204109.71891: filtering new block on tags 34589 1727204109.71914: done filtering new block on tags 34589 1727204109.71917: in VariableManager get_vars() 34589 1727204109.71938: done with get_vars() 34589 1727204109.71940: filtering new block on tags 34589 1727204109.71957: done filtering new block on tags 34589 1727204109.71959: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 34589 1727204109.71964: extending task lists for all hosts with included blocks 34589 1727204109.72828: done extending task lists 34589 1727204109.72830: done processing included files 34589 1727204109.72831: results queue empty 34589 1727204109.72831: checking for any_errors_fatal 34589 1727204109.72834: done checking for any_errors_fatal 34589 1727204109.72835: checking for max_fail_percentage 34589 1727204109.72836: done checking for max_fail_percentage 34589 1727204109.72837: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.72838: done checking to see if all hosts have failed 34589 1727204109.72839: getting the remaining hosts for this loop 34589 1727204109.72840: done getting the remaining hosts for this loop 34589 1727204109.72842: getting the next task for host managed-node1 34589 1727204109.72846: done getting next task for host managed-node1 34589 1727204109.72849: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204109.72852: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204109.72869: getting variables 34589 1727204109.72870: in VariableManager get_vars() 34589 1727204109.72887: Calling all_inventory to load vars for managed-node1 34589 1727204109.72890: Calling groups_inventory to load vars for managed-node1 34589 1727204109.72892: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.72898: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.72900: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.72903: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.73095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.73321: done with get_vars() 34589 1727204109.73331: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.051) 0:00:09.869 ***** 34589 1727204109.73434: entering _queue_task() for managed-node1/setup 34589 1727204109.74070: worker is 1 (out of 1 available) 34589 1727204109.74089: exiting _queue_task() for managed-node1/setup 34589 1727204109.74100: done queuing things up, now waiting for results queue to drain 34589 1727204109.74101: waiting for pending results... 34589 1727204109.74388: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204109.74474: in run() - task 028d2410-947f-a9c6-cddc-00000000038e 34589 1727204109.74493: variable 'ansible_search_path' from source: unknown 34589 1727204109.74497: variable 'ansible_search_path' from source: unknown 34589 1727204109.74527: calling self._execute() 34589 1727204109.74593: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.74597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.74604: variable 'omit' from source: magic vars 34589 1727204109.74880: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.74890: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.75033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204109.76774: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204109.76780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204109.76784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204109.76820: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204109.76845: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204109.77000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
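This step runs the setup module guarded by the __network_required_facts condition that is evaluated (and found False) just below; a sketch in which the gather_subset argument is an assumption, since the module arguments are never echoed for a skipped task:

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # assumed; not visible in this log because the task is skipped
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0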
34589 1727204109.77018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204109.77021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204109.77024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204109.77037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204109.77123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204109.77145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204109.77186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204109.77230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204109.77257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204109.77356: variable '__network_required_facts' from source: role '' defaults 34589 1727204109.77364: variable 'ansible_facts' from source: unknown 34589 1727204109.77425: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34589 1727204109.77430: when evaluation is False, skipping this task 34589 1727204109.77433: _execute() done 34589 1727204109.77444: dumping result to json 34589 1727204109.77447: done dumping result, returning 34589 1727204109.77450: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-a9c6-cddc-00000000038e] 34589 1727204109.77470: sending task result for task 028d2410-947f-a9c6-cddc-00000000038e 34589 1727204109.77568: done sending task result for task 028d2410-947f-a9c6-cddc-00000000038e 34589 1727204109.77571: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204109.77622: no more pending results, returning what we have 34589 1727204109.77625: results queue empty 34589 1727204109.77626: checking for any_errors_fatal 34589 1727204109.77627: done checking for any_errors_fatal 34589 1727204109.77628: checking for max_fail_percentage 34589 1727204109.77629: done checking for max_fail_percentage 34589 1727204109.77630: checking to see if all hosts have failed and the running 
result is not ok 34589 1727204109.77631: done checking to see if all hosts have failed 34589 1727204109.77631: getting the remaining hosts for this loop 34589 1727204109.77632: done getting the remaining hosts for this loop 34589 1727204109.77636: getting the next task for host managed-node1 34589 1727204109.77644: done getting next task for host managed-node1 34589 1727204109.77647: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204109.77650: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204109.77666: getting variables 34589 1727204109.77667: in VariableManager get_vars() 34589 1727204109.77708: Calling all_inventory to load vars for managed-node1 34589 1727204109.77711: Calling groups_inventory to load vars for managed-node1 34589 1727204109.77713: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.77723: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.77725: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.77728: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.77989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.78180: done with get_vars() 34589 1727204109.78188: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.048) 0:00:09.917 ***** 34589 1727204109.78263: entering _queue_task() for managed-node1/stat 34589 1727204109.78470: worker is 1 (out of 1 available) 34589 1727204109.78485: exiting _queue_task() for managed-node1/stat 34589 1727204109.78498: done queuing things up, now waiting for results queue to drain 34589 1727204109.78499: waiting for pending results... 
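The stat task queued here (set_facts.yml:12) is skipped because __network_is_ostree is already defined; a sketch of what such a check typically looks like, where the path and register name are assumptions (the log only confirms the stat module and the when condition):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumed path; not echoed because the task is skipped
  register: __ostree_booted_stat   # assumed name
  when: not __network_is_ostree is defined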
34589 1727204109.78992: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204109.78998: in run() - task 028d2410-947f-a9c6-cddc-000000000390 34589 1727204109.79002: variable 'ansible_search_path' from source: unknown 34589 1727204109.79004: variable 'ansible_search_path' from source: unknown 34589 1727204109.79010: calling self._execute() 34589 1727204109.79124: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.79128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.79139: variable 'omit' from source: magic vars 34589 1727204109.79724: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.79743: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.80064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204109.80503: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204109.80556: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204109.80613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204109.80661: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204109.80792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204109.80843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204109.80962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204109.80966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204109.81071: variable '__network_is_ostree' from source: set_fact 34589 1727204109.81080: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204109.81083: when evaluation is False, skipping this task 34589 1727204109.81086: _execute() done 34589 1727204109.81101: dumping result to json 34589 1727204109.81104: done dumping result, returning 34589 1727204109.81107: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-a9c6-cddc-000000000390] 34589 1727204109.81109: sending task result for task 028d2410-947f-a9c6-cddc-000000000390 34589 1727204109.81433: done sending task result for task 028d2410-947f-a9c6-cddc-000000000390 34589 1727204109.81436: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204109.81649: no more pending results, returning what we have 34589 1727204109.81653: results queue empty 34589 1727204109.81654: checking for any_errors_fatal 34589 1727204109.81662: done checking for any_errors_fatal 34589 1727204109.81663: checking for 
max_fail_percentage 34589 1727204109.81665: done checking for max_fail_percentage 34589 1727204109.81666: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.81667: done checking to see if all hosts have failed 34589 1727204109.81668: getting the remaining hosts for this loop 34589 1727204109.81670: done getting the remaining hosts for this loop 34589 1727204109.81674: getting the next task for host managed-node1 34589 1727204109.81683: done getting next task for host managed-node1 34589 1727204109.81693: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204109.81701: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204109.81720: getting variables 34589 1727204109.81722: in VariableManager get_vars() 34589 1727204109.81764: Calling all_inventory to load vars for managed-node1 34589 1727204109.81767: Calling groups_inventory to load vars for managed-node1 34589 1727204109.81769: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.81967: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.81972: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.81982: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.82384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.83005: done with get_vars() 34589 1727204109.83022: done getting variables 34589 1727204109.83193: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.049) 0:00:09.967 ***** 34589 1727204109.83234: entering _queue_task() for managed-node1/set_fact 34589 1727204109.84068: worker is 1 (out of 1 available) 34589 1727204109.84082: exiting _queue_task() for managed-node1/set_fact 34589 1727204109.84094: done queuing things up, now waiting for results queue to drain 34589 1727204109.84095: waiting for pending results... 
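The companion set_fact (set_facts.yml:17) is skipped for the same reason; a sketch under the assumption that it derives the flag from the stat result of the previous step:

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression
  when: not __network_is_ostree is defined   # condition confirmed by the skip reason above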
34589 1727204109.84730: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204109.85090: in run() - task 028d2410-947f-a9c6-cddc-000000000391 34589 1727204109.85094: variable 'ansible_search_path' from source: unknown 34589 1727204109.85097: variable 'ansible_search_path' from source: unknown 34589 1727204109.85100: calling self._execute() 34589 1727204109.85220: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.85314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.85331: variable 'omit' from source: magic vars 34589 1727204109.86052: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.86172: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.86752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204109.87872: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204109.87962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204109.88134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204109.88401: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204109.88497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204109.88548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204109.88588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204109.88628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204109.88795: variable '__network_is_ostree' from source: set_fact 34589 1727204109.88828: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204109.88916: when evaluation is False, skipping this task 34589 1727204109.88919: _execute() done 34589 1727204109.88924: dumping result to json 34589 1727204109.88926: done dumping result, returning 34589 1727204109.88929: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-a9c6-cddc-000000000391] 34589 1727204109.88931: sending task result for task 028d2410-947f-a9c6-cddc-000000000391 34589 1727204109.89093: done sending task result for task 028d2410-947f-a9c6-cddc-000000000391 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204109.89146: no more pending results, returning what we have 34589 1727204109.89149: results queue empty 34589 1727204109.89150: checking for any_errors_fatal 34589 1727204109.89156: done checking for any_errors_fatal 34589 1727204109.89157: checking for max_fail_percentage 34589 
1727204109.89158: done checking for max_fail_percentage 34589 1727204109.89159: checking to see if all hosts have failed and the running result is not ok 34589 1727204109.89160: done checking to see if all hosts have failed 34589 1727204109.89160: getting the remaining hosts for this loop 34589 1727204109.89161: done getting the remaining hosts for this loop 34589 1727204109.89165: getting the next task for host managed-node1 34589 1727204109.89174: done getting next task for host managed-node1 34589 1727204109.89179: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204109.89183: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204109.89197: getting variables 34589 1727204109.89200: in VariableManager get_vars() 34589 1727204109.89240: Calling all_inventory to load vars for managed-node1 34589 1727204109.89243: Calling groups_inventory to load vars for managed-node1 34589 1727204109.89245: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204109.89254: Calling all_plugins_play to load vars for managed-node1 34589 1727204109.89256: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204109.89259: Calling groups_plugins_play to load vars for managed-node1 34589 1727204109.89560: WORKER PROCESS EXITING 34589 1727204109.89586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204109.89800: done with get_vars() 34589 1727204109.89811: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.066) 0:00:10.034 ***** 34589 1727204109.89914: entering _queue_task() for managed-node1/service_facts 34589 1727204109.89916: Creating lock for service_facts 34589 1727204109.90366: worker is 1 (out of 1 available) 34589 1727204109.90383: exiting _queue_task() for managed-node1/service_facts 34589 1727204109.90399: done queuing things up, now waiting for results queue to drain 34589 1727204109.90400: waiting for pending results... 
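The service_facts task queued here (set_facts.yml:21) takes no module arguments in its simplest form; a sketch:

- name: Check which services are running
  service_facts:

The low-level SSH activity that follows is this module being staged and executed on managed-node1 (home-directory probe, then creation of the remote ansible-tmp directory).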
34589 1727204109.90697: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204109.90783: in run() - task 028d2410-947f-a9c6-cddc-000000000393 34589 1727204109.90796: variable 'ansible_search_path' from source: unknown 34589 1727204109.90805: variable 'ansible_search_path' from source: unknown 34589 1727204109.90845: calling self._execute() 34589 1727204109.90935: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.90939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.91011: variable 'omit' from source: magic vars 34589 1727204109.91325: variable 'ansible_distribution_major_version' from source: facts 34589 1727204109.91336: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204109.91349: variable 'omit' from source: magic vars 34589 1727204109.91421: variable 'omit' from source: magic vars 34589 1727204109.91461: variable 'omit' from source: magic vars 34589 1727204109.91503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204109.91543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204109.91578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204109.91598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.91663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204109.91667: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204109.91669: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.91672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.91995: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204109.91999: Set connection var ansible_shell_executable to /bin/sh 34589 1727204109.92001: Set connection var ansible_timeout to 10 34589 1727204109.92004: Set connection var ansible_shell_type to sh 34589 1727204109.92006: Set connection var ansible_connection to ssh 34589 1727204109.92008: Set connection var ansible_pipelining to False 34589 1727204109.92010: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.92012: variable 'ansible_connection' from source: unknown 34589 1727204109.92014: variable 'ansible_module_compression' from source: unknown 34589 1727204109.92016: variable 'ansible_shell_type' from source: unknown 34589 1727204109.92018: variable 'ansible_shell_executable' from source: unknown 34589 1727204109.92021: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204109.92023: variable 'ansible_pipelining' from source: unknown 34589 1727204109.92025: variable 'ansible_timeout' from source: unknown 34589 1727204109.92027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204109.92189: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204109.92199: variable 'omit' from source: magic vars 34589 
1727204109.92211: starting attempt loop 34589 1727204109.92218: running the handler 34589 1727204109.92232: _low_level_execute_command(): starting 34589 1727204109.92253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204109.93315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.93438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204109.93442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.93445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.93529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.95345: stdout chunk (state=3): >>>/root <<< 34589 1727204109.95597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.95602: stdout chunk (state=3): >>><<< 34589 1727204109.95605: stderr chunk (state=3): >>><<< 34589 1727204109.95640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.95760: _low_level_execute_command(): starting 34589 1727204109.95764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436 `" && echo ansible-tmp-1727204109.9565144-35961-105316151882436="` echo 
/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436 `" ) && sleep 0' 34589 1727204109.96495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204109.96521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204109.96546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204109.96574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204109.96601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204109.96693: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204109.96739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204109.96757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204109.96819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204109.96926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204109.99053: stdout chunk (state=3): >>>ansible-tmp-1727204109.9565144-35961-105316151882436=/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436 <<< 34589 1727204109.99192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204109.99287: stderr chunk (state=3): >>><<< 34589 1727204109.99291: stdout chunk (state=3): >>><<< 34589 1727204109.99294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.9565144-35961-105316151882436=/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204109.99336: variable 
'ansible_module_compression' from source: unknown 34589 1727204109.99431: ANSIBALLZ: Using lock for service_facts 34589 1727204109.99434: ANSIBALLZ: Acquiring lock 34589 1727204109.99437: ANSIBALLZ: Lock acquired: 140222016059824 34589 1727204109.99442: ANSIBALLZ: Creating module 34589 1727204110.09540: ANSIBALLZ: Writing module into payload 34589 1727204110.09679: ANSIBALLZ: Writing module 34589 1727204110.09683: ANSIBALLZ: Renaming module 34589 1727204110.09687: ANSIBALLZ: Done creating module 34589 1727204110.09689: variable 'ansible_facts' from source: unknown 34589 1727204110.09897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py 34589 1727204110.10000: Sending initial data 34589 1727204110.10004: Sent initial data (162 bytes) 34589 1727204110.10692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204110.10712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204110.11139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204110.12794: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34589 1727204110.12797: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 34589 1727204110.12817: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 34589 1727204110.12821: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204110.12923: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204110.13011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp3_j3g18p /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py <<< 34589 1727204110.13014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py" <<< 34589 1727204110.13087: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp3_j3g18p" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py" <<< 34589 1727204110.14254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204110.14258: stdout chunk (state=3): >>><<< 34589 1727204110.14261: stderr chunk (state=3): >>><<< 34589 1727204110.14285: done transferring module to remote 34589 1727204110.14449: _low_level_execute_command(): starting 34589 1727204110.14453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/ /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py && sleep 0' 34589 1727204110.15592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204110.15604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204110.15615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204110.15729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204110.17680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204110.17720: stderr chunk (state=3): >>><<< 34589 1727204110.17722: stdout chunk (state=3): >>><<< 34589 1727204110.17787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204110.17791: _low_level_execute_command(): starting 34589 1727204110.17794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/AnsiballZ_service_facts.py && sleep 0' 34589 1727204110.18397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204110.18401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204110.18486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204111.95222: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": 
"modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 34589 1727204111.95234: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 34589 1727204111.95245: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", 
"source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": 
{"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source"<<< 34589 1727204111.95254: stdout chunk (state=3): >>>: "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34589 1727204111.97329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204111.97335: stderr chunk (state=3): >>><<< 34589 1727204111.97338: stdout chunk (state=3): >>><<< 34589 1727204111.97343: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": 
"systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204112.02483: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204112.02603: _low_level_execute_command(): starting 34589 1727204112.02611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.9565144-35961-105316151882436/ > /dev/null 2>&1 && sleep 0' 34589 1727204112.03356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204112.03360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204112.03362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.03364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.03366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204112.03368: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204112.03370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204112.03372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204112.03382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204112.03386: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204112.03388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204112.03390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.03617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204112.03711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204112.05781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204112.05785: stdout chunk (state=3): >>><<< 34589 1727204112.05787: stderr chunk (state=3): >>><<< 34589 1727204112.05796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204112.05799: handler run complete 34589 1727204112.06182: variable 'ansible_facts' from source: unknown 34589 1727204112.06185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204112.06561: variable 'ansible_facts' from source: unknown 34589 1727204112.06673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204112.07069: attempt loop complete, returning result 34589 1727204112.07073: _execute() done 34589 1727204112.07078: dumping result to json 34589 1727204112.07384: done dumping result, returning 34589 1727204112.07387: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-a9c6-cddc-000000000393] 34589 1727204112.07390: sending task result for task 028d2410-947f-a9c6-cddc-000000000393 34589 1727204112.08933: done sending task result for task 028d2410-947f-a9c6-cddc-000000000393 34589 1727204112.08937: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204112.09253: no more pending results, returning what we have 34589 1727204112.09256: results queue empty 34589 1727204112.09257: checking for any_errors_fatal 34589 1727204112.09259: done checking for any_errors_fatal 34589 1727204112.09260: checking for max_fail_percentage 34589 1727204112.09261: done checking for max_fail_percentage 34589 1727204112.09262: checking to see if all hosts have failed and the running result is not ok 34589 1727204112.09263: done checking to see if all hosts have failed 34589 1727204112.09264: getting the remaining hosts for this loop 34589 1727204112.09265: done getting the remaining hosts for this loop 34589 1727204112.09268: getting the next task for host managed-node1 34589 1727204112.09273: done getting next task for host managed-node1 34589 1727204112.09279: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204112.09283: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204112.09292: getting variables 34589 1727204112.09293: in VariableManager get_vars() 34589 1727204112.09322: Calling all_inventory to load vars for managed-node1 34589 1727204112.09324: Calling groups_inventory to load vars for managed-node1 34589 1727204112.09327: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204112.09334: Calling all_plugins_play to load vars for managed-node1 34589 1727204112.09337: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204112.09339: Calling groups_plugins_play to load vars for managed-node1 34589 1727204112.09730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204112.10265: done with get_vars() 34589 1727204112.10284: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:12 -0400 (0:00:02.204) 0:00:12.238 ***** 34589 1727204112.10390: entering _queue_task() for managed-node1/package_facts 34589 1727204112.10392: Creating lock for package_facts 34589 1727204112.10705: worker is 1 (out of 1 available) 34589 1727204112.10721: exiting _queue_task() for managed-node1/package_facts 34589 1727204112.10734: done queuing things up, now waiting for results queue to drain 34589 1727204112.10735: waiting for pending results... 34589 1727204112.10993: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204112.11038: in run() - task 028d2410-947f-a9c6-cddc-000000000394 34589 1727204112.11050: variable 'ansible_search_path' from source: unknown 34589 1727204112.11054: variable 'ansible_search_path' from source: unknown 34589 1727204112.11090: calling self._execute() 34589 1727204112.11163: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204112.11169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204112.11179: variable 'omit' from source: magic vars 34589 1727204112.11639: variable 'ansible_distribution_major_version' from source: facts 34589 1727204112.12083: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204112.12087: variable 'omit' from source: magic vars 34589 1727204112.12090: variable 'omit' from source: magic vars 34589 1727204112.12092: variable 'omit' from source: magic vars 34589 1727204112.12094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204112.12096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204112.12098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204112.12100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204112.12102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204112.12105: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204112.12110: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204112.12112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204112.12114: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204112.12117: Set connection var ansible_shell_executable to /bin/sh 34589 1727204112.12119: Set connection var ansible_timeout to 10 34589 1727204112.12121: Set connection var ansible_shell_type to sh 34589 1727204112.12123: Set connection var ansible_connection to ssh 34589 1727204112.12126: Set connection var ansible_pipelining to False 34589 1727204112.12128: variable 'ansible_shell_executable' from source: unknown 34589 1727204112.12130: variable 'ansible_connection' from source: unknown 34589 1727204112.12132: variable 'ansible_module_compression' from source: unknown 34589 1727204112.12134: variable 'ansible_shell_type' from source: unknown 34589 1727204112.12137: variable 'ansible_shell_executable' from source: unknown 34589 1727204112.12139: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204112.12140: variable 'ansible_pipelining' from source: unknown 34589 1727204112.12143: variable 'ansible_timeout' from source: unknown 34589 1727204112.12145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204112.12148: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204112.12151: variable 'omit' from source: magic vars 34589 1727204112.12153: starting attempt loop 34589 1727204112.12156: running the handler 34589 1727204112.12158: _low_level_execute_command(): starting 34589 1727204112.12161: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204112.12808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204112.12824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204112.12840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.12858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.12874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204112.12890: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204112.12989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.13195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 
1727204112.13291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204112.15073: stdout chunk (state=3): >>>/root <<< 34589 1727204112.15205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204112.15342: stderr chunk (state=3): >>><<< 34589 1727204112.15346: stdout chunk (state=3): >>><<< 34589 1727204112.15366: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204112.15390: _low_level_execute_command(): starting 34589 1727204112.15453: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896 `" && echo ansible-tmp-1727204112.1537294-36044-194902286005896="` echo /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896 `" ) && sleep 0' 34589 1727204112.16372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204112.16396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204112.16422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.16441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.16489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204112.16568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204112.16596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.16629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 34589 1727204112.16748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204112.18828: stdout chunk (state=3): >>>ansible-tmp-1727204112.1537294-36044-194902286005896=/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896 <<< 34589 1727204112.18997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204112.19000: stdout chunk (state=3): >>><<< 34589 1727204112.19003: stderr chunk (state=3): >>><<< 34589 1727204112.19285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204112.1537294-36044-194902286005896=/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204112.19289: variable 'ansible_module_compression' from source: unknown 34589 1727204112.19310: ANSIBALLZ: Using lock for package_facts 34589 1727204112.19318: ANSIBALLZ: Acquiring lock 34589 1727204112.19326: ANSIBALLZ: Lock acquired: 140222010702928 34589 1727204112.19334: ANSIBALLZ: Creating module 34589 1727204112.68816: ANSIBALLZ: Writing module into payload 34589 1727204112.69211: ANSIBALLZ: Writing module 34589 1727204112.69305: ANSIBALLZ: Renaming module 34589 1727204112.69318: ANSIBALLZ: Done creating module 34589 1727204112.69346: variable 'ansible_facts' from source: unknown 34589 1727204112.69762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py 34589 1727204112.70153: Sending initial data 34589 1727204112.70157: Sent initial data (162 bytes) 34589 1727204112.71447: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204112.71464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.71485: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.71497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204112.71589: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204112.71660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.71803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204112.71882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204112.73797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34589 1727204112.73801: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204112.73857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204112.73972: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpqvr1fvby /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py <<< 34589 1727204112.73977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py" <<< 34589 1727204112.74053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpqvr1fvby" to remote "/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py" <<< 34589 1727204112.76989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204112.76993: stderr chunk (state=3): >>><<< 34589 1727204112.76996: stdout chunk (state=3): >>><<< 34589 1727204112.76998: done transferring module to remote 34589 1727204112.77001: _low_level_execute_command(): starting 34589 1727204112.77003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/ /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py && sleep 0' 34589 1727204112.78667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.78685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.78702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204112.78860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204112.78871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.78968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204112.79084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204112.81057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204112.81343: stderr chunk (state=3): >>><<< 34589 1727204112.81347: stdout chunk (state=3): >>><<< 34589 1727204112.81350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204112.81353: _low_level_execute_command(): starting 34589 1727204112.81356: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/AnsiballZ_package_facts.py && sleep 0' 34589 1727204112.82517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204112.82534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204112.82570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204112.82636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204112.82679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204112.82801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204112.82896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204113.30178: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", 
"release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 34589 1727204113.30440: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": 
"openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", 
"version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", 
"version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": 
"1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": 
[{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34589 1727204113.32464: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204113.32528: stderr chunk (state=3): >>><<< 34589 1727204113.32531: stdout chunk (state=3): >>><<< 34589 1727204113.32783: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": 
[{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": 
"2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204113.39391: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204113.39883: _low_level_execute_command(): starting 34589 1727204113.39887: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204112.1537294-36044-194902286005896/ > /dev/null 2>&1 && sleep 0' 34589 1727204113.41048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204113.41063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204113.41074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204113.41536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204113.41549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204113.41990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204113.43913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204113.43995: stderr chunk (state=3): >>><<< 34589 1727204113.44005: stdout chunk (state=3): >>><<< 34589 1727204113.44035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204113.44049: handler run complete 34589 1727204113.47183: variable 'ansible_facts' from source: unknown 34589 1727204113.48983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204113.54983: variable 'ansible_facts' from source: unknown 34589 1727204113.66985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204113.68717: attempt loop complete, returning result 34589 1727204113.68743: _execute() done 34589 1727204113.68752: dumping result to json 34589 1727204113.69179: done dumping result, returning 34589 1727204113.69393: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-a9c6-cddc-000000000394] 34589 1727204113.69403: sending task result for task 028d2410-947f-a9c6-cddc-000000000394 34589 1727204113.73825: done sending task result for task 028d2410-947f-a9c6-cddc-000000000394 34589 1727204113.73829: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204113.73932: no more pending results, returning what we have 34589 1727204113.73934: results queue empty 34589 1727204113.73935: checking for any_errors_fatal 34589 1727204113.73939: done checking for any_errors_fatal 34589 1727204113.73940: checking for max_fail_percentage 34589 1727204113.73942: done checking for max_fail_percentage 34589 1727204113.73943: checking to see if all hosts have failed and the running result is not ok 34589 1727204113.73944: done checking to see if all hosts have failed 34589 1727204113.73945: getting the remaining hosts for this loop 34589 1727204113.73946: done getting the remaining hosts for this loop 34589 1727204113.73949: getting the next task for host managed-node1 34589 1727204113.73954: done getting next task for host managed-node1 34589 1727204113.73957: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204113.73960: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 34589 1727204113.73969: getting variables 34589 1727204113.73970: in VariableManager get_vars() 34589 1727204113.74001: Calling all_inventory to load vars for managed-node1 34589 1727204113.74004: Calling groups_inventory to load vars for managed-node1 34589 1727204113.74009: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204113.74018: Calling all_plugins_play to load vars for managed-node1 34589 1727204113.74021: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204113.74024: Calling groups_plugins_play to load vars for managed-node1 34589 1727204113.76598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204113.79964: done with get_vars() 34589 1727204113.79991: done getting variables 34589 1727204113.80056: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:13 -0400 (0:00:01.700) 0:00:13.938 ***** 34589 1727204113.80396: entering _queue_task() for managed-node1/debug 34589 1727204113.80939: worker is 1 (out of 1 available) 34589 1727204113.80953: exiting _queue_task() for managed-node1/debug 34589 1727204113.80966: done queuing things up, now waiting for results queue to drain 34589 1727204113.80967: waiting for pending results... 
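For orientation, here is a minimal sketch of what a debug task of the shape just queued (roles/network/tasks/main.yml:7) could look like. It is illustrative only, not the actual source of fedora.linux_system_roles.network; the task name, the network_provider variable and the distribution guard are the pieces confirmed by the trace, which reports "Using network provider: nm" a little further down.

    # Hedged sketch of a provider-printing debug task; not quoted from the role
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
      when: ansible_distribution_major_version != '6'   # guard evaluated for every task in this trace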
34589 1727204113.81793: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204113.81798: in run() - task 028d2410-947f-a9c6-cddc-000000000018 34589 1727204113.81801: variable 'ansible_search_path' from source: unknown 34589 1727204113.81804: variable 'ansible_search_path' from source: unknown 34589 1727204113.82121: calling self._execute() 34589 1727204113.82224: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204113.82228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204113.82239: variable 'omit' from source: magic vars 34589 1727204113.83249: variable 'ansible_distribution_major_version' from source: facts 34589 1727204113.83260: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204113.83267: variable 'omit' from source: magic vars 34589 1727204113.83602: variable 'omit' from source: magic vars 34589 1727204113.83980: variable 'network_provider' from source: set_fact 34589 1727204113.83984: variable 'omit' from source: magic vars 34589 1727204113.84065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204113.84217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204113.84235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204113.84253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204113.84263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204113.84401: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204113.84404: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204113.84410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204113.84528: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204113.84534: Set connection var ansible_shell_executable to /bin/sh 34589 1727204113.84537: Set connection var ansible_timeout to 10 34589 1727204113.84540: Set connection var ansible_shell_type to sh 34589 1727204113.84548: Set connection var ansible_connection to ssh 34589 1727204113.84553: Set connection var ansible_pipelining to False 34589 1727204113.84577: variable 'ansible_shell_executable' from source: unknown 34589 1727204113.84581: variable 'ansible_connection' from source: unknown 34589 1727204113.84583: variable 'ansible_module_compression' from source: unknown 34589 1727204113.84586: variable 'ansible_shell_type' from source: unknown 34589 1727204113.84589: variable 'ansible_shell_executable' from source: unknown 34589 1727204113.84591: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204113.84593: variable 'ansible_pipelining' from source: unknown 34589 1727204113.84596: variable 'ansible_timeout' from source: unknown 34589 1727204113.84600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204113.85010: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 34589 1727204113.85017: variable 'omit' from source: magic vars 34589 1727204113.85081: starting attempt loop 34589 1727204113.85084: running the handler 34589 1727204113.85189: handler run complete 34589 1727204113.85199: attempt loop complete, returning result 34589 1727204113.85202: _execute() done 34589 1727204113.85205: dumping result to json 34589 1727204113.85210: done dumping result, returning 34589 1727204113.85217: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-a9c6-cddc-000000000018] 34589 1727204113.85222: sending task result for task 028d2410-947f-a9c6-cddc-000000000018 ok: [managed-node1] => {} MSG: Using network provider: nm 34589 1727204113.85450: no more pending results, returning what we have 34589 1727204113.85453: results queue empty 34589 1727204113.85454: checking for any_errors_fatal 34589 1727204113.85463: done checking for any_errors_fatal 34589 1727204113.85463: checking for max_fail_percentage 34589 1727204113.85465: done checking for max_fail_percentage 34589 1727204113.85466: checking to see if all hosts have failed and the running result is not ok 34589 1727204113.85467: done checking to see if all hosts have failed 34589 1727204113.85467: getting the remaining hosts for this loop 34589 1727204113.85468: done getting the remaining hosts for this loop 34589 1727204113.85472: getting the next task for host managed-node1 34589 1727204113.85480: done getting next task for host managed-node1 34589 1727204113.85484: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204113.85488: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204113.85501: getting variables 34589 1727204113.85503: in VariableManager get_vars() 34589 1727204113.85544: Calling all_inventory to load vars for managed-node1 34589 1727204113.85546: Calling groups_inventory to load vars for managed-node1 34589 1727204113.85548: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204113.85558: Calling all_plugins_play to load vars for managed-node1 34589 1727204113.85560: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204113.85563: Calling groups_plugins_play to load vars for managed-node1 34589 1727204113.86198: done sending task result for task 028d2410-947f-a9c6-cddc-000000000018 34589 1727204113.86202: WORKER PROCESS EXITING 34589 1727204113.87962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204113.90715: done with get_vars() 34589 1727204113.90746: done getting variables 34589 1727204113.90805: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.104) 0:00:14.043 ***** 34589 1727204113.90838: entering _queue_task() for managed-node1/fail 34589 1727204113.91579: worker is 1 (out of 1 available) 34589 1727204113.91591: exiting _queue_task() for managed-node1/fail 34589 1727204113.91602: done queuing things up, now waiting for results queue to drain 34589 1727204113.91604: waiting for pending results... 
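The fail task just queued (main.yml:11), and the very similar "below 8" check that follows it at main.yml:18, are both skipped in the results below because their first condition, network_state != {}, is already False, so any further conditions never appear in the trace. A hedged sketch of the general shape of such a guard, where the second condition is a plausible assumption rather than something visible in this run:

    # Hedged sketch of a network_state guard task; the provider condition is assumed, not shown in the trace
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported by the initscripts provider.
      when:
        - network_state != {}                    # the condition reported as false_condition below
        - network_provider == "initscripts"      # assumed second condition, never evaluated in this run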
34589 1727204113.92173: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204113.92336: in run() - task 028d2410-947f-a9c6-cddc-000000000019 34589 1727204113.92499: variable 'ansible_search_path' from source: unknown 34589 1727204113.92509: variable 'ansible_search_path' from source: unknown 34589 1727204113.92546: calling self._execute() 34589 1727204113.92810: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204113.92819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204113.92832: variable 'omit' from source: magic vars 34589 1727204113.93474: variable 'ansible_distribution_major_version' from source: facts 34589 1727204113.93494: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204113.93621: variable 'network_state' from source: role '' defaults 34589 1727204113.93635: Evaluated conditional (network_state != {}): False 34589 1727204113.93641: when evaluation is False, skipping this task 34589 1727204113.93647: _execute() done 34589 1727204113.93653: dumping result to json 34589 1727204113.93661: done dumping result, returning 34589 1727204113.93676: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-a9c6-cddc-000000000019] 34589 1727204113.93688: sending task result for task 028d2410-947f-a9c6-cddc-000000000019 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204113.93834: no more pending results, returning what we have 34589 1727204113.93838: results queue empty 34589 1727204113.93839: checking for any_errors_fatal 34589 1727204113.93847: done checking for any_errors_fatal 34589 1727204113.93847: checking for max_fail_percentage 34589 1727204113.93849: done checking for max_fail_percentage 34589 1727204113.93850: checking to see if all hosts have failed and the running result is not ok 34589 1727204113.93850: done checking to see if all hosts have failed 34589 1727204113.93851: getting the remaining hosts for this loop 34589 1727204113.93852: done getting the remaining hosts for this loop 34589 1727204113.93856: getting the next task for host managed-node1 34589 1727204113.93863: done getting next task for host managed-node1 34589 1727204113.93867: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204113.93870: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204113.93887: getting variables 34589 1727204113.93889: in VariableManager get_vars() 34589 1727204113.93934: Calling all_inventory to load vars for managed-node1 34589 1727204113.93937: Calling groups_inventory to load vars for managed-node1 34589 1727204113.93940: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204113.93956: Calling all_plugins_play to load vars for managed-node1 34589 1727204113.93959: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204113.94122: Calling groups_plugins_play to load vars for managed-node1 34589 1727204113.94658: done sending task result for task 028d2410-947f-a9c6-cddc-000000000019 34589 1727204113.94662: WORKER PROCESS EXITING 34589 1727204113.95954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204113.97848: done with get_vars() 34589 1727204113.97873: done getting variables 34589 1727204113.97986: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.071) 0:00:14.115 ***** 34589 1727204113.98020: entering _queue_task() for managed-node1/fail 34589 1727204113.98701: worker is 1 (out of 1 available) 34589 1727204113.98829: exiting _queue_task() for managed-node1/fail 34589 1727204113.98843: done queuing things up, now waiting for results queue to drain 34589 1727204113.98844: waiting for pending results... 
34589 1727204113.99155: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204113.99463: in run() - task 028d2410-947f-a9c6-cddc-00000000001a 34589 1727204113.99535: variable 'ansible_search_path' from source: unknown 34589 1727204113.99682: variable 'ansible_search_path' from source: unknown 34589 1727204113.99686: calling self._execute() 34589 1727204113.99823: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.00062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.00065: variable 'omit' from source: magic vars 34589 1727204114.00668: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.00729: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.00968: variable 'network_state' from source: role '' defaults 34589 1727204114.01053: Evaluated conditional (network_state != {}): False 34589 1727204114.01062: when evaluation is False, skipping this task 34589 1727204114.01070: _execute() done 34589 1727204114.01079: dumping result to json 34589 1727204114.01088: done dumping result, returning 34589 1727204114.01098: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-a9c6-cddc-00000000001a] 34589 1727204114.01113: sending task result for task 028d2410-947f-a9c6-cddc-00000000001a skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204114.01266: no more pending results, returning what we have 34589 1727204114.01269: results queue empty 34589 1727204114.01270: checking for any_errors_fatal 34589 1727204114.01278: done checking for any_errors_fatal 34589 1727204114.01279: checking for max_fail_percentage 34589 1727204114.01281: done checking for max_fail_percentage 34589 1727204114.01282: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.01282: done checking to see if all hosts have failed 34589 1727204114.01283: getting the remaining hosts for this loop 34589 1727204114.01284: done getting the remaining hosts for this loop 34589 1727204114.01288: getting the next task for host managed-node1 34589 1727204114.01295: done getting next task for host managed-node1 34589 1727204114.01298: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204114.01302: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204114.01316: getting variables 34589 1727204114.01318: in VariableManager get_vars() 34589 1727204114.01356: Calling all_inventory to load vars for managed-node1 34589 1727204114.01359: Calling groups_inventory to load vars for managed-node1 34589 1727204114.01361: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.01371: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.01373: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.01491: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.02257: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001a 34589 1727204114.02262: WORKER PROCESS EXITING 34589 1727204114.04064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.07379: done with get_vars() 34589 1727204114.07413: done getting variables 34589 1727204114.07586: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.095) 0:00:14.211 ***** 34589 1727204114.07620: entering _queue_task() for managed-node1/fail 34589 1727204114.08819: worker is 1 (out of 1 available) 34589 1727204114.08832: exiting _queue_task() for managed-node1/fail 34589 1727204114.08845: done queuing things up, now waiting for results queue to drain 34589 1727204114.08846: waiting for pending results... 
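Unlike the two previous guards, the teaming check queued here (main.yml:25) gets far enough for its full conditional to appear in the skip result further below. A hedged sketch of a fail task built around that exact expression; the when conditions are taken from the trace, while the failure message wording is invented for illustration:

    # Hedged sketch of the teaming guard; conditions match the trace, msg wording is illustrative
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later.
      when:
        - ansible_distribution_major_version | int > 9        # evaluated True in the trace
        - ansible_distribution in __network_rh_distros        # evaluated True in the trace
        - >-
          network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0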
34589 1727204114.09339: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204114.09623: in run() - task 028d2410-947f-a9c6-cddc-00000000001b 34589 1727204114.09644: variable 'ansible_search_path' from source: unknown 34589 1727204114.09701: variable 'ansible_search_path' from source: unknown 34589 1727204114.09881: calling self._execute() 34589 1727204114.10020: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.10257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.10261: variable 'omit' from source: magic vars 34589 1727204114.11132: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.11395: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.11637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204114.18800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204114.19074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204114.19081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204114.19292: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204114.19296: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204114.19403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.19439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.19469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.19551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.19637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.19853: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.19877: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34589 1727204114.20119: variable 'ansible_distribution' from source: facts 34589 1727204114.20170: variable '__network_rh_distros' from source: role '' defaults 34589 1727204114.20188: Evaluated conditional (ansible_distribution in __network_rh_distros): True 34589 1727204114.20684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.20812: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.20931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.20946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.21148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.21152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.21154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.21156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.21294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.21315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.21367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.21585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.21588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.21591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.21780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.22311: variable 'network_connections' from source: task vars 34589 1727204114.22327: variable 'interface' from source: set_fact 34589 1727204114.22473: variable 'interface' from source: set_fact 34589 1727204114.22512: variable 'interface' from source: set_fact 34589 1727204114.22679: variable 'interface' from source: set_fact 34589 1727204114.22693: variable 'network_state' from source: role '' defaults 34589 
1727204114.22767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204114.23189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204114.23294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204114.23335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204114.23404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204114.23642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204114.23653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204114.23708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.23747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204114.23851: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 34589 1727204114.23971: when evaluation is False, skipping this task 34589 1727204114.23977: _execute() done 34589 1727204114.23981: dumping result to json 34589 1727204114.23984: done dumping result, returning 34589 1727204114.23987: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-a9c6-cddc-00000000001b] 34589 1727204114.23990: sending task result for task 028d2410-947f-a9c6-cddc-00000000001b skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 34589 1727204114.24315: no more pending results, returning what we have 34589 1727204114.24319: results queue empty 34589 1727204114.24320: checking for any_errors_fatal 34589 1727204114.24326: done checking for any_errors_fatal 34589 1727204114.24327: checking for max_fail_percentage 34589 1727204114.24328: done checking for max_fail_percentage 34589 1727204114.24329: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.24330: done checking to see if all hosts have failed 34589 1727204114.24330: getting the remaining hosts for this loop 34589 1727204114.24332: done getting the remaining hosts for this loop 34589 1727204114.24336: getting the next task for host managed-node1 34589 1727204114.24344: done getting next task for host managed-node1 34589 1727204114.24348: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204114.24351: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204114.24366: getting variables 34589 1727204114.24368: in VariableManager get_vars() 34589 1727204114.24411: Calling all_inventory to load vars for managed-node1 34589 1727204114.24414: Calling groups_inventory to load vars for managed-node1 34589 1727204114.24416: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.24427: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.24430: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.24433: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.25238: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001b 34589 1727204114.25242: WORKER PROCESS EXITING 34589 1727204114.27571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.29227: done with get_vars() 34589 1727204114.29299: done getting variables 34589 1727204114.29447: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.218) 0:00:14.429 ***** 34589 1727204114.29480: entering _queue_task() for managed-node1/dnf 34589 1727204114.30226: worker is 1 (out of 1 available) 34589 1727204114.30238: exiting _queue_task() for managed-node1/dnf 34589 1727204114.30252: done queuing things up, now waiting for results queue to drain 34589 1727204114.30253: waiting for pending results... 
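The task queued here (main.yml:36) uses the dnf action, and the trace that follows shows two guards: the distribution check evaluates True and the wireless/team check evaluates False, so the task is skipped. A hedged sketch of such a check; the module arguments are not visible in the trace and are purely illustrative:

    # Hedged sketch of a DNF update check; package name and check_mode are assumptions
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: NetworkManager       # illustrative package; the real arguments are not shown here
        state: latest
      check_mode: true
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined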
34589 1727204114.30894: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204114.31026: in run() - task 028d2410-947f-a9c6-cddc-00000000001c 34589 1727204114.31045: variable 'ansible_search_path' from source: unknown 34589 1727204114.31098: variable 'ansible_search_path' from source: unknown 34589 1727204114.31153: calling self._execute() 34589 1727204114.31327: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.31390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.31403: variable 'omit' from source: magic vars 34589 1727204114.32128: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.32142: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.32589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204114.36235: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204114.36313: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204114.36357: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204114.36435: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204114.36460: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204114.36549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.36625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.36629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.36656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.36682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.36818: variable 'ansible_distribution' from source: facts 34589 1727204114.36822: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.36841: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34589 1727204114.36959: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204114.37169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.37172: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.37175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.37191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.37205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.37251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.37278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.37299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.37338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.37359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.37399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.37422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.37490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.37494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.37503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.37660: variable 'network_connections' from source: task vars 34589 1727204114.37672: variable 'interface' from source: set_fact 34589 1727204114.37749: variable 'interface' from source: set_fact 34589 1727204114.37758: variable 'interface' from source: set_fact 34589 1727204114.37824: variable 'interface' from source: set_fact 34589 1727204114.37925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 34589 1727204114.38071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204114.38112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204114.38163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204114.38251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204114.38255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204114.38262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204114.38291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.38318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204114.38383: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204114.38612: variable 'network_connections' from source: task vars 34589 1727204114.38615: variable 'interface' from source: set_fact 34589 1727204114.38681: variable 'interface' from source: set_fact 34589 1727204114.38685: variable 'interface' from source: set_fact 34589 1727204114.38781: variable 'interface' from source: set_fact 34589 1727204114.38788: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204114.38791: when evaluation is False, skipping this task 34589 1727204114.38793: _execute() done 34589 1727204114.38796: dumping result to json 34589 1727204114.38797: done dumping result, returning 34589 1727204114.38800: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000001c] 34589 1727204114.38801: sending task result for task 028d2410-947f-a9c6-cddc-00000000001c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204114.38933: no more pending results, returning what we have 34589 1727204114.38936: results queue empty 34589 1727204114.38937: checking for any_errors_fatal 34589 1727204114.38942: done checking for any_errors_fatal 34589 1727204114.38943: checking for max_fail_percentage 34589 1727204114.38945: done checking for max_fail_percentage 34589 1727204114.38946: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.38946: done checking to see if all hosts have failed 34589 1727204114.38947: getting the remaining hosts for this loop 34589 1727204114.38949: done getting the remaining hosts for this loop 34589 1727204114.38953: getting the next task for host managed-node1 34589 1727204114.38960: done getting next task for host managed-node1 34589 
1727204114.38963: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204114.38966: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204114.39084: getting variables 34589 1727204114.39087: in VariableManager get_vars() 34589 1727204114.39131: Calling all_inventory to load vars for managed-node1 34589 1727204114.39134: Calling groups_inventory to load vars for managed-node1 34589 1727204114.39137: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.39142: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001c 34589 1727204114.39145: WORKER PROCESS EXITING 34589 1727204114.39154: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.39157: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.39159: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.40515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.42198: done with get_vars() 34589 1727204114.42226: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204114.42308: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.128) 0:00:14.558 ***** 34589 1727204114.42344: entering _queue_task() for managed-node1/yum 34589 1727204114.42346: Creating lock for yum 34589 1727204114.42907: worker is 1 (out of 1 available) 34589 1727204114.42916: exiting _queue_task() for managed-node1/yum 34589 1727204114.42927: done queuing things up, now waiting for results queue to drain 34589 1727204114.42928: waiting for pending results... 
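The companion YUM check queued here (main.yml:48) is the legacy-path variant; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above, and the skip below on the version guard alone. A hedged sketch, with the same caveat that the module arguments are illustrative and the wireless/team guard is assumed to mirror the DNF variant, since it is short-circuited out of this run:

    # Hedged sketch of the YUM variant; arguments are illustrative, second guard assumed
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: NetworkManager       # illustrative package; the real arguments are not shown here
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8        # False on this EL10 host, reported as false_condition
        - __network_wireless_connections_defined or __network_team_connections_defined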
34589 1727204114.43002: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204114.43122: in run() - task 028d2410-947f-a9c6-cddc-00000000001d 34589 1727204114.43134: variable 'ansible_search_path' from source: unknown 34589 1727204114.43138: variable 'ansible_search_path' from source: unknown 34589 1727204114.43181: calling self._execute() 34589 1727204114.43269: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.43277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.43288: variable 'omit' from source: magic vars 34589 1727204114.43658: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.43669: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.43889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204114.45994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204114.46249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204114.46253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204114.46255: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204114.46258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204114.46260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.46263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.46274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.46324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.46338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.46434: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.46581: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34589 1727204114.46584: when evaluation is False, skipping this task 34589 1727204114.46586: _execute() done 34589 1727204114.46588: dumping result to json 34589 1727204114.46590: done dumping result, returning 34589 1727204114.46593: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000001d] 34589 
1727204114.46595: sending task result for task 028d2410-947f-a9c6-cddc-00000000001d 34589 1727204114.46658: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001d 34589 1727204114.46662: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34589 1727204114.46717: no more pending results, returning what we have 34589 1727204114.46721: results queue empty 34589 1727204114.46722: checking for any_errors_fatal 34589 1727204114.46728: done checking for any_errors_fatal 34589 1727204114.46728: checking for max_fail_percentage 34589 1727204114.46730: done checking for max_fail_percentage 34589 1727204114.46731: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.46732: done checking to see if all hosts have failed 34589 1727204114.46733: getting the remaining hosts for this loop 34589 1727204114.46734: done getting the remaining hosts for this loop 34589 1727204114.46742: getting the next task for host managed-node1 34589 1727204114.46750: done getting next task for host managed-node1 34589 1727204114.46754: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204114.46757: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204114.46771: getting variables 34589 1727204114.46773: in VariableManager get_vars() 34589 1727204114.46813: Calling all_inventory to load vars for managed-node1 34589 1727204114.46817: Calling groups_inventory to load vars for managed-node1 34589 1727204114.46819: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.46829: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.46832: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.46835: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.48322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.49883: done with get_vars() 34589 1727204114.49915: done getting variables 34589 1727204114.49969: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.076) 0:00:14.634 ***** 34589 1727204114.50004: entering _queue_task() for managed-node1/fail 34589 1727204114.50478: worker is 1 (out of 1 available) 34589 1727204114.50489: exiting _queue_task() for managed-node1/fail 34589 1727204114.50500: done queuing things up, now waiting for results queue to drain 34589 1727204114.50501: waiting for pending results... 
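The consent task queued here (main.yml:60) also loads the fail action; its evaluation begins just below, starting with __network_wireless_connections_defined. A heavily hedged sketch of what such a task might look like; the message wording and the network_allow_restart variable name are guesses, not confirmed by this trace:

    # Heavily hedged sketch; variable name and message are assumptions
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Wireless or team connections were requested, which may require restarting
          NetworkManager; set the consent variable to allow this.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_allow_restart | d(false)     # hypothetical variable name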
34589 1727204114.50709: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204114.50821: in run() - task 028d2410-947f-a9c6-cddc-00000000001e 34589 1727204114.50840: variable 'ansible_search_path' from source: unknown 34589 1727204114.50844: variable 'ansible_search_path' from source: unknown 34589 1727204114.50881: calling self._execute() 34589 1727204114.50973: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.50980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.50990: variable 'omit' from source: magic vars 34589 1727204114.51468: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.51472: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.51534: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204114.51728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204114.54174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204114.54254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204114.54381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204114.54385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204114.54388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204114.54582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.54586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.54589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.54621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.54624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.54737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.54801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.54829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.54874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.54888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.54931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.54964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.54998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.55039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.55052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.55240: variable 'network_connections' from source: task vars 34589 1727204114.55251: variable 'interface' from source: set_fact 34589 1727204114.55333: variable 'interface' from source: set_fact 34589 1727204114.55343: variable 'interface' from source: set_fact 34589 1727204114.55416: variable 'interface' from source: set_fact 34589 1727204114.55484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204114.55654: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204114.55689: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204114.55737: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204114.55761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204114.55800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204114.55823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204114.55852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.55874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204114.55932: 
variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204114.56168: variable 'network_connections' from source: task vars 34589 1727204114.56173: variable 'interface' from source: set_fact 34589 1727204114.56232: variable 'interface' from source: set_fact 34589 1727204114.56280: variable 'interface' from source: set_fact 34589 1727204114.56296: variable 'interface' from source: set_fact 34589 1727204114.56326: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204114.56330: when evaluation is False, skipping this task 34589 1727204114.56332: _execute() done 34589 1727204114.56334: dumping result to json 34589 1727204114.56339: done dumping result, returning 34589 1727204114.56346: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000001e] 34589 1727204114.56356: sending task result for task 028d2410-947f-a9c6-cddc-00000000001e 34589 1727204114.56457: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001e 34589 1727204114.56460: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204114.56536: no more pending results, returning what we have 34589 1727204114.56540: results queue empty 34589 1727204114.56541: checking for any_errors_fatal 34589 1727204114.56546: done checking for any_errors_fatal 34589 1727204114.56547: checking for max_fail_percentage 34589 1727204114.56549: done checking for max_fail_percentage 34589 1727204114.56550: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.56550: done checking to see if all hosts have failed 34589 1727204114.56551: getting the remaining hosts for this loop 34589 1727204114.56552: done getting the remaining hosts for this loop 34589 1727204114.56557: getting the next task for host managed-node1 34589 1727204114.56565: done getting next task for host managed-node1 34589 1727204114.56569: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34589 1727204114.56572: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204114.56591: getting variables 34589 1727204114.56595: in VariableManager get_vars() 34589 1727204114.56639: Calling all_inventory to load vars for managed-node1 34589 1727204114.56642: Calling groups_inventory to load vars for managed-node1 34589 1727204114.56644: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.56655: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.56658: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.56661: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.58987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.67674: done with get_vars() 34589 1727204114.67927: done getting variables 34589 1727204114.67974: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.180) 0:00:14.814 ***** 34589 1727204114.68006: entering _queue_task() for managed-node1/package 34589 1727204114.68596: worker is 1 (out of 1 available) 34589 1727204114.68612: exiting _queue_task() for managed-node1/package 34589 1727204114.68627: done queuing things up, now waiting for results queue to drain 34589 1727204114.68628: waiting for pending results... 34589 1727204114.69223: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 34589 1727204114.69381: in run() - task 028d2410-947f-a9c6-cddc-00000000001f 34589 1727204114.69385: variable 'ansible_search_path' from source: unknown 34589 1727204114.69388: variable 'ansible_search_path' from source: unknown 34589 1727204114.69391: calling self._execute() 34589 1727204114.69475: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.69491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.69506: variable 'omit' from source: magic vars 34589 1727204114.69961: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.69965: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.70132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204114.70453: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204114.70532: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204114.70622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204114.70658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204114.70985: variable 'network_packages' from source: role '' defaults 34589 1727204114.71285: variable '__network_provider_setup' from source: role '' defaults 34589 1727204114.71288: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204114.71482: variable 
'__network_service_name_default_nm' from source: role '' defaults 34589 1727204114.71486: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204114.71488: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204114.71858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204114.73991: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204114.74054: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204114.74110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204114.74153: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204114.74212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204114.74333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.74435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.74541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.74635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.74668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.74769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.75083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.75086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.75089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.75093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.75441: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204114.75621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.75847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.75851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.75854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.75856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.76041: variable 'ansible_python' from source: facts 34589 1727204114.76283: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204114.76481: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204114.76484: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204114.76793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.76827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.76858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.76977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.76998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.77263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204114.77281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204114.77284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.77287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204114.77374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204114.77627: variable 'network_connections' from source: task vars 34589 1727204114.77639: variable 'interface' from source: set_fact 34589 1727204114.78026: variable 'interface' from source: set_fact 34589 1727204114.78029: variable 'interface' from source: set_fact 34589 1727204114.78382: variable 'interface' from source: set_fact 34589 1727204114.78385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204114.78387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204114.78419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204114.78450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204114.78501: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204114.78980: variable 'network_connections' from source: task vars 34589 1727204114.79148: variable 'interface' from source: set_fact 34589 1727204114.79284: variable 'interface' from source: set_fact 34589 1727204114.79299: variable 'interface' from source: set_fact 34589 1727204114.79513: variable 'interface' from source: set_fact 34589 1727204114.79570: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204114.79759: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204114.80482: variable 'network_connections' from source: task vars 34589 1727204114.80494: variable 'interface' from source: set_fact 34589 1727204114.80777: variable 'interface' from source: set_fact 34589 1727204114.80781: variable 'interface' from source: set_fact 34589 1727204114.80784: variable 'interface' from source: set_fact 34589 1727204114.80786: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204114.80948: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204114.81563: variable 'network_connections' from source: task vars 34589 1727204114.81574: variable 'interface' from source: set_fact 34589 1727204114.81709: variable 'interface' from source: set_fact 34589 1727204114.81763: variable 'interface' from source: set_fact 34589 1727204114.81830: variable 'interface' from source: set_fact 34589 1727204114.82037: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204114.82301: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204114.82304: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204114.82307: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204114.82766: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204114.83762: variable 'network_connections' from source: task vars 34589 1727204114.83824: variable 'interface' from source: set_fact 34589 
1727204114.83885: variable 'interface' from source: set_fact 34589 1727204114.83933: variable 'interface' from source: set_fact 34589 1727204114.84283: variable 'interface' from source: set_fact 34589 1727204114.84286: variable 'ansible_distribution' from source: facts 34589 1727204114.84288: variable '__network_rh_distros' from source: role '' defaults 34589 1727204114.84290: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.84291: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204114.84444: variable 'ansible_distribution' from source: facts 34589 1727204114.84506: variable '__network_rh_distros' from source: role '' defaults 34589 1727204114.84517: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.84534: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204114.84811: variable 'ansible_distribution' from source: facts 34589 1727204114.84941: variable '__network_rh_distros' from source: role '' defaults 34589 1727204114.84952: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.84995: variable 'network_provider' from source: set_fact 34589 1727204114.85017: variable 'ansible_facts' from source: unknown 34589 1727204114.86425: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34589 1727204114.86434: when evaluation is False, skipping this task 34589 1727204114.86442: _execute() done 34589 1727204114.86450: dumping result to json 34589 1727204114.86462: done dumping result, returning 34589 1727204114.86679: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-a9c6-cddc-00000000001f] 34589 1727204114.86682: sending task result for task 028d2410-947f-a9c6-cddc-00000000001f 34589 1727204114.86750: done sending task result for task 028d2410-947f-a9c6-cddc-00000000001f 34589 1727204114.86753: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34589 1727204114.86804: no more pending results, returning what we have 34589 1727204114.86809: results queue empty 34589 1727204114.86810: checking for any_errors_fatal 34589 1727204114.86818: done checking for any_errors_fatal 34589 1727204114.86819: checking for max_fail_percentage 34589 1727204114.86820: done checking for max_fail_percentage 34589 1727204114.86821: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.86822: done checking to see if all hosts have failed 34589 1727204114.86822: getting the remaining hosts for this loop 34589 1727204114.86823: done getting the remaining hosts for this loop 34589 1727204114.86827: getting the next task for host managed-node1 34589 1727204114.86833: done getting next task for host managed-node1 34589 1727204114.86837: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204114.86839: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204114.86856: getting variables 34589 1727204114.86857: in VariableManager get_vars() 34589 1727204114.86896: Calling all_inventory to load vars for managed-node1 34589 1727204114.86900: Calling groups_inventory to load vars for managed-node1 34589 1727204114.86902: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.86915: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.86918: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.86922: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.89811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.93284: done with get_vars() 34589 1727204114.93315: done getting variables 34589 1727204114.93487: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.255) 0:00:15.070 ***** 34589 1727204114.93524: entering _queue_task() for managed-node1/package 34589 1727204114.94468: worker is 1 (out of 1 available) 34589 1727204114.94481: exiting _queue_task() for managed-node1/package 34589 1727204114.94493: done queuing things up, now waiting for results queue to drain 34589 1727204114.94494: waiting for pending results... 
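The task header above points at roles/network/tasks/main.yml:85 and loads the 'package' action plugin; the trace that follows evaluates network_state != {} as False and skips it. A hedged sketch of such a task, with the package list inferred from the task title rather than from the role source:

    # Reconstruction for illustration only; the when-condition is taken from
    # the log, the package names and state are assumptions.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}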
34589 1727204114.94874: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204114.95053: in run() - task 028d2410-947f-a9c6-cddc-000000000020 34589 1727204114.95058: variable 'ansible_search_path' from source: unknown 34589 1727204114.95161: variable 'ansible_search_path' from source: unknown 34589 1727204114.95165: calling self._execute() 34589 1727204114.95269: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204114.95273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204114.95277: variable 'omit' from source: magic vars 34589 1727204114.95630: variable 'ansible_distribution_major_version' from source: facts 34589 1727204114.95648: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204114.95759: variable 'network_state' from source: role '' defaults 34589 1727204114.95772: Evaluated conditional (network_state != {}): False 34589 1727204114.95781: when evaluation is False, skipping this task 34589 1727204114.95789: _execute() done 34589 1727204114.95794: dumping result to json 34589 1727204114.95804: done dumping result, returning 34589 1727204114.95815: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-a9c6-cddc-000000000020] 34589 1727204114.95825: sending task result for task 028d2410-947f-a9c6-cddc-000000000020 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204114.96082: no more pending results, returning what we have 34589 1727204114.96086: results queue empty 34589 1727204114.96086: checking for any_errors_fatal 34589 1727204114.96095: done checking for any_errors_fatal 34589 1727204114.96095: checking for max_fail_percentage 34589 1727204114.96097: done checking for max_fail_percentage 34589 1727204114.96098: checking to see if all hosts have failed and the running result is not ok 34589 1727204114.96099: done checking to see if all hosts have failed 34589 1727204114.96099: getting the remaining hosts for this loop 34589 1727204114.96100: done getting the remaining hosts for this loop 34589 1727204114.96104: getting the next task for host managed-node1 34589 1727204114.96113: done getting next task for host managed-node1 34589 1727204114.96116: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204114.96118: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204114.96133: getting variables 34589 1727204114.96134: in VariableManager get_vars() 34589 1727204114.96166: Calling all_inventory to load vars for managed-node1 34589 1727204114.96168: Calling groups_inventory to load vars for managed-node1 34589 1727204114.96170: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204114.96181: Calling all_plugins_play to load vars for managed-node1 34589 1727204114.96183: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204114.96187: Calling groups_plugins_play to load vars for managed-node1 34589 1727204114.96719: done sending task result for task 028d2410-947f-a9c6-cddc-000000000020 34589 1727204114.96722: WORKER PROCESS EXITING 34589 1727204114.97923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204114.99822: done with get_vars() 34589 1727204114.99847: done getting variables 34589 1727204114.99999: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.065) 0:00:15.135 ***** 34589 1727204115.00033: entering _queue_task() for managed-node1/package 34589 1727204115.00600: worker is 1 (out of 1 available) 34589 1727204115.00611: exiting _queue_task() for managed-node1/package 34589 1727204115.00623: done queuing things up, now waiting for results queue to drain 34589 1727204115.00625: waiting for pending results... 
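The same pattern repeats for the task at roles/network/tasks/main.yml:96, again via the 'package' action and again gated on network_state != {}. A hedged sketch, with the package name inferred from the task title:

    # Illustration only; skipped in this run because network_state is the
    # role default {}.
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}

Both install tasks would presumably start running once the caller supplies a non-empty network_state, since that is the only condition this log reports as false for them.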
34589 1727204115.00769: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204115.00907: in run() - task 028d2410-947f-a9c6-cddc-000000000021 34589 1727204115.00927: variable 'ansible_search_path' from source: unknown 34589 1727204115.00933: variable 'ansible_search_path' from source: unknown 34589 1727204115.00978: calling self._execute() 34589 1727204115.01089: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204115.01100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204115.01112: variable 'omit' from source: magic vars 34589 1727204115.01491: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.01518: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204115.01653: variable 'network_state' from source: role '' defaults 34589 1727204115.01668: Evaluated conditional (network_state != {}): False 34589 1727204115.01677: when evaluation is False, skipping this task 34589 1727204115.01719: _execute() done 34589 1727204115.01722: dumping result to json 34589 1727204115.01725: done dumping result, returning 34589 1727204115.01728: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-a9c6-cddc-000000000021] 34589 1727204115.01730: sending task result for task 028d2410-947f-a9c6-cddc-000000000021 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204115.02003: no more pending results, returning what we have 34589 1727204115.02008: results queue empty 34589 1727204115.02009: checking for any_errors_fatal 34589 1727204115.02016: done checking for any_errors_fatal 34589 1727204115.02017: checking for max_fail_percentage 34589 1727204115.02019: done checking for max_fail_percentage 34589 1727204115.02019: checking to see if all hosts have failed and the running result is not ok 34589 1727204115.02020: done checking to see if all hosts have failed 34589 1727204115.02021: getting the remaining hosts for this loop 34589 1727204115.02022: done getting the remaining hosts for this loop 34589 1727204115.02026: getting the next task for host managed-node1 34589 1727204115.02034: done getting next task for host managed-node1 34589 1727204115.02152: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204115.02156: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204115.02174: getting variables 34589 1727204115.02178: in VariableManager get_vars() 34589 1727204115.02221: Calling all_inventory to load vars for managed-node1 34589 1727204115.02224: Calling groups_inventory to load vars for managed-node1 34589 1727204115.02227: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204115.02239: Calling all_plugins_play to load vars for managed-node1 34589 1727204115.02242: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204115.02245: Calling groups_plugins_play to load vars for managed-node1 34589 1727204115.02793: done sending task result for task 028d2410-947f-a9c6-cddc-000000000021 34589 1727204115.02797: WORKER PROCESS EXITING 34589 1727204115.03880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204115.05593: done with get_vars() 34589 1727204115.05628: done getting variables 34589 1727204115.05798: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.058) 0:00:15.194 ***** 34589 1727204115.05939: entering _queue_task() for managed-node1/service 34589 1727204115.05941: Creating lock for service 34589 1727204115.06436: worker is 1 (out of 1 available) 34589 1727204115.06449: exiting _queue_task() for managed-node1/service 34589 1727204115.06463: done queuing things up, now waiting for results queue to drain 34589 1727204115.06464: waiting for pending results... 
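The task at roles/network/tasks/main.yml:109 uses the 'service' action plugin and, as the trace below shows, is skipped on the same wireless/team condition as the earlier consent task. A minimal hedged sketch (the unit name and restart semantics are assumptions drawn from the task title, not from the role source):

    # Reconstruction for illustration; only the task name, the 'service'
    # action plugin and the when-condition are confirmed by this log.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined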
34589 1727204115.06994: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204115.07400: in run() - task 028d2410-947f-a9c6-cddc-000000000022 34589 1727204115.07445: variable 'ansible_search_path' from source: unknown 34589 1727204115.07455: variable 'ansible_search_path' from source: unknown 34589 1727204115.07501: calling self._execute() 34589 1727204115.07703: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204115.07719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204115.07734: variable 'omit' from source: magic vars 34589 1727204115.08361: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.08488: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204115.08567: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204115.08816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204115.12948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204115.13117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204115.13246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204115.13294: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204115.13354: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204115.13432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.13475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.13572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.13579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.13582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.13721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.13835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.13839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34589 1727204115.13872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.13901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.13955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.13987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.14025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.14072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.14095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.14410: variable 'network_connections' from source: task vars 34589 1727204115.14444: variable 'interface' from source: set_fact 34589 1727204115.14553: variable 'interface' from source: set_fact 34589 1727204115.14556: variable 'interface' from source: set_fact 34589 1727204115.14611: variable 'interface' from source: set_fact 34589 1727204115.14699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204115.14991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204115.15044: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204115.15091: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204115.15148: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204115.15211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204115.15243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204115.15281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.15308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204115.15427: variable '__network_team_connections_defined' from source: role '' defaults 
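The trace around this point resolves network_connections together with the interface fact set earlier in the play in order to decide whether any wireless or team connection is defined. Purely as an illustration of what would flip that decision, a hypothetical network_connections entry of type team (values invented, not taken from this run):

    # Hypothetical input: with an entry like this,
    # __network_team_connections_defined would presumably evaluate to True
    # and the restart task above would no longer be skipped.
    network_connections:
      - name: team0
        type: team
        state: up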
34589 1727204115.15915: variable 'network_connections' from source: task vars 34589 1727204115.15928: variable 'interface' from source: set_fact 34589 1727204115.16001: variable 'interface' from source: set_fact 34589 1727204115.16017: variable 'interface' from source: set_fact 34589 1727204115.16079: variable 'interface' from source: set_fact 34589 1727204115.16164: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204115.16216: when evaluation is False, skipping this task 34589 1727204115.16381: _execute() done 34589 1727204115.16385: dumping result to json 34589 1727204115.16387: done dumping result, returning 34589 1727204115.16389: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-000000000022] 34589 1727204115.16399: sending task result for task 028d2410-947f-a9c6-cddc-000000000022 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204115.16636: no more pending results, returning what we have 34589 1727204115.16639: results queue empty 34589 1727204115.16640: checking for any_errors_fatal 34589 1727204115.16649: done checking for any_errors_fatal 34589 1727204115.16650: checking for max_fail_percentage 34589 1727204115.16652: done checking for max_fail_percentage 34589 1727204115.16652: checking to see if all hosts have failed and the running result is not ok 34589 1727204115.16653: done checking to see if all hosts have failed 34589 1727204115.16654: getting the remaining hosts for this loop 34589 1727204115.16655: done getting the remaining hosts for this loop 34589 1727204115.16659: getting the next task for host managed-node1 34589 1727204115.16667: done getting next task for host managed-node1 34589 1727204115.16671: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204115.16674: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204115.16693: getting variables 34589 1727204115.16694: in VariableManager get_vars() 34589 1727204115.16736: Calling all_inventory to load vars for managed-node1 34589 1727204115.16740: Calling groups_inventory to load vars for managed-node1 34589 1727204115.16742: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204115.16752: Calling all_plugins_play to load vars for managed-node1 34589 1727204115.16755: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204115.16758: Calling groups_plugins_play to load vars for managed-node1 34589 1727204115.17300: done sending task result for task 028d2410-947f-a9c6-cddc-000000000022 34589 1727204115.17304: WORKER PROCESS EXITING 34589 1727204115.20014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204115.24082: done with get_vars() 34589 1727204115.24227: done getting variables 34589 1727204115.24295: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.186) 0:00:15.380 ***** 34589 1727204115.24564: entering _queue_task() for managed-node1/service 34589 1727204115.25218: worker is 1 (out of 1 available) 34589 1727204115.25231: exiting _queue_task() for managed-node1/service 34589 1727204115.25243: done queuing things up, now waiting for results queue to drain 34589 1727204115.25244: waiting for pending results... 
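The task at roles/network/tasks/main.yml:122 also uses the 'service' action plugin, and the trace below shows its condition (network_provider == "nm" or network_state != {}) evaluating to True, so this one actually runs; the log then resolves network_service_name from the role defaults. A hedged sketch of such a task (the enabled/started parameters are assumptions; only the task name, the action plugin, the condition and the network_service_name variable are confirmed by the log):

    # Reconstruction for illustration only.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}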
34589 1727204115.25524: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204115.25783: in run() - task 028d2410-947f-a9c6-cddc-000000000023 34589 1727204115.25788: variable 'ansible_search_path' from source: unknown 34589 1727204115.25790: variable 'ansible_search_path' from source: unknown 34589 1727204115.25793: calling self._execute() 34589 1727204115.26083: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204115.26086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204115.26088: variable 'omit' from source: magic vars 34589 1727204115.26659: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.26674: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204115.27582: variable 'network_provider' from source: set_fact 34589 1727204115.27585: variable 'network_state' from source: role '' defaults 34589 1727204115.27588: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34589 1727204115.27591: variable 'omit' from source: magic vars 34589 1727204115.27593: variable 'omit' from source: magic vars 34589 1727204115.27596: variable 'network_service_name' from source: role '' defaults 34589 1727204115.27882: variable 'network_service_name' from source: role '' defaults 34589 1727204115.27980: variable '__network_provider_setup' from source: role '' defaults 34589 1727204115.28251: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204115.28481: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204115.28484: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204115.28629: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204115.29323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204115.33505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204115.33758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204115.33882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204115.33894: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204115.33980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204115.34158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.34195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.34225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.34264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 34589 1727204115.34284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.34335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.34581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.34584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.34587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.34589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.34889: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204115.35130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.35269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.35369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.35417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.35431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.35642: variable 'ansible_python' from source: facts 34589 1727204115.35664: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204115.35928: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204115.36121: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204115.36360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.36391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.36418: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.36574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.36594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.36642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204115.36781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204115.36813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.36849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204115.36864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204115.37179: variable 'network_connections' from source: task vars 34589 1727204115.37187: variable 'interface' from source: set_fact 34589 1727204115.37379: variable 'interface' from source: set_fact 34589 1727204115.37391: variable 'interface' from source: set_fact 34589 1727204115.37577: variable 'interface' from source: set_fact 34589 1727204115.37688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204115.38282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204115.38286: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204115.38396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204115.38555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204115.38621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204115.38701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204115.38737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204115.38887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 34589 1727204115.38937: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204115.39558: variable 'network_connections' from source: task vars 34589 1727204115.39564: variable 'interface' from source: set_fact 34589 1727204115.39724: variable 'interface' from source: set_fact 34589 1727204115.39738: variable 'interface' from source: set_fact 34589 1727204115.39815: variable 'interface' from source: set_fact 34589 1727204115.39869: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204115.39982: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204115.40274: variable 'network_connections' from source: task vars 34589 1727204115.40279: variable 'interface' from source: set_fact 34589 1727204115.40350: variable 'interface' from source: set_fact 34589 1727204115.40370: variable 'interface' from source: set_fact 34589 1727204115.40483: variable 'interface' from source: set_fact 34589 1727204115.40486: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204115.40535: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204115.41147: variable 'network_connections' from source: task vars 34589 1727204115.41150: variable 'interface' from source: set_fact 34589 1727204115.41224: variable 'interface' from source: set_fact 34589 1727204115.41236: variable 'interface' from source: set_fact 34589 1727204115.41429: variable 'interface' from source: set_fact 34589 1727204115.42085: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204115.42089: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204115.42091: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204115.42093: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204115.42572: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204115.43761: variable 'network_connections' from source: task vars 34589 1727204115.43764: variable 'interface' from source: set_fact 34589 1727204115.43828: variable 'interface' from source: set_fact 34589 1727204115.43835: variable 'interface' from source: set_fact 34589 1727204115.44004: variable 'interface' from source: set_fact 34589 1727204115.44016: variable 'ansible_distribution' from source: facts 34589 1727204115.44019: variable '__network_rh_distros' from source: role '' defaults 34589 1727204115.44025: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.44048: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204115.44480: variable 'ansible_distribution' from source: facts 34589 1727204115.44483: variable '__network_rh_distros' from source: role '' defaults 34589 1727204115.44489: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.44491: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204115.44780: variable 'ansible_distribution' from source: facts 34589 1727204115.44786: variable '__network_rh_distros' from source: role '' defaults 34589 1727204115.44788: variable 'ansible_distribution_major_version' from source: facts 34589 1727204115.44790: variable 'network_provider' from source: set_fact 34589 1727204115.44951: 
variable 'omit' from source: magic vars 34589 1727204115.44979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204115.45007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204115.45030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204115.45104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204115.45117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204115.45202: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204115.45205: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204115.45208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204115.45295: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204115.45298: Set connection var ansible_shell_executable to /bin/sh 34589 1727204115.45307: Set connection var ansible_timeout to 10 34589 1727204115.45314: Set connection var ansible_shell_type to sh 34589 1727204115.45367: Set connection var ansible_connection to ssh 34589 1727204115.45371: Set connection var ansible_pipelining to False 34589 1727204115.45373: variable 'ansible_shell_executable' from source: unknown 34589 1727204115.45377: variable 'ansible_connection' from source: unknown 34589 1727204115.45380: variable 'ansible_module_compression' from source: unknown 34589 1727204115.45382: variable 'ansible_shell_type' from source: unknown 34589 1727204115.45384: variable 'ansible_shell_executable' from source: unknown 34589 1727204115.45386: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204115.45392: variable 'ansible_pipelining' from source: unknown 34589 1727204115.45394: variable 'ansible_timeout' from source: unknown 34589 1727204115.45396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204115.45591: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204115.45594: variable 'omit' from source: magic vars 34589 1727204115.45597: starting attempt loop 34589 1727204115.45599: running the handler 34589 1727204115.45601: variable 'ansible_facts' from source: unknown 34589 1727204115.47147: _low_level_execute_command(): starting 34589 1727204115.47151: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204115.47898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204115.47981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204115.48000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204115.48116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204115.50023: stdout chunk (state=3): >>>/root <<< 34589 1727204115.50026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204115.50096: stderr chunk (state=3): >>><<< 34589 1727204115.50099: stdout chunk (state=3): >>><<< 34589 1727204115.50384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204115.50388: _low_level_execute_command(): starting 34589 1727204115.50390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271 `" && echo ansible-tmp-1727204115.5012043-36325-26766446238271="` echo /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271 `" ) && sleep 0' 34589 1727204115.51401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204115.51404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204115.51409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204115.51411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204115.51413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204115.51415: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204115.51417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204115.51419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 
1727204115.51441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204115.51726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204115.54085: stdout chunk (state=3): >>>ansible-tmp-1727204115.5012043-36325-26766446238271=/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271 <<< 34589 1727204115.54089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204115.54091: stdout chunk (state=3): >>><<< 34589 1727204115.54094: stderr chunk (state=3): >>><<< 34589 1727204115.54097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204115.5012043-36325-26766446238271=/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204115.54099: variable 'ansible_module_compression' from source: unknown 34589 1727204115.54172: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 34589 1727204115.54184: ANSIBALLZ: Acquiring lock 34589 1727204115.54196: ANSIBALLZ: Lock acquired: 140222054199088 34589 1727204115.54205: ANSIBALLZ: Creating module 34589 1727204115.96059: ANSIBALLZ: Writing module into payload 34589 1727204115.96328: ANSIBALLZ: Writing module 34589 1727204115.96332: ANSIBALLZ: Renaming module 34589 1727204115.96334: ANSIBALLZ: Done creating module 34589 1727204115.96341: variable 'ansible_facts' from source: unknown 34589 1727204115.96558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py 34589 1727204115.96790: Sending initial data 34589 1727204115.96793: Sent initial data (155 bytes) 34589 1727204115.97396: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204115.97492: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 34589 1727204115.97496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204115.97537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204115.97560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204115.97577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204115.97758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204115.99445: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204115.99521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204115.99628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp_w86jps1 /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py <<< 34589 1727204115.99631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py" <<< 34589 1727204115.99692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp_w86jps1" to remote "/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py" <<< 34589 1727204116.02661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204116.02769: stdout chunk (state=3): >>><<< 34589 1727204116.02773: stderr chunk (state=3): >>><<< 34589 1727204116.02777: done transferring module to remote 34589 1727204116.02780: _low_level_execute_command(): starting 34589 1727204116.02782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/ /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py && sleep 0' 34589 1727204116.04020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204116.04024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204116.04028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.04030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204116.04189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.04216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204116.04238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204116.04255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204116.04356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204116.06494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204116.06599: stderr chunk (state=3): >>><<< 34589 1727204116.06774: stdout chunk (state=3): >>><<< 34589 1727204116.06783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204116.06785: _low_level_execute_command(): starting 34589 1727204116.06788: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/AnsiballZ_systemd.py && sleep 0' 34589 1727204116.07852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204116.07866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204116.07889: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.08549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204116.08553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204116.08793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204116.08986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204116.40392: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", 
"FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10702848", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299500032", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1460490000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 34589 1727204116.40474: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": 
"sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34589 1727204116.42723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204116.42730: stdout chunk (state=3): >>><<< 34589 1727204116.42733: stderr chunk (state=3): >>><<< 34589 1727204116.42749: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10702848", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299500032", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1460490000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204116.43162: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204116.43165: _low_level_execute_command(): starting 34589 1727204116.43167: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204115.5012043-36325-26766446238271/ > /dev/null 2>&1 && sleep 0' 34589 1727204116.44604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204116.44626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204116.44633: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204116.44643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.44802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.44830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204116.44846: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 34589 1727204116.44954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204116.45108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204116.47139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204116.47143: stderr chunk (state=3): >>><<< 34589 1727204116.47237: stdout chunk (state=3): >>><<< 34589 1727204116.47253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204116.47261: handler run complete 34589 1727204116.47325: attempt loop complete, returning result 34589 1727204116.47328: _execute() done 34589 1727204116.47386: dumping result to json 34589 1727204116.47409: done dumping result, returning 34589 1727204116.47417: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-a9c6-cddc-000000000023] 34589 1727204116.47419: sending task result for task 028d2410-947f-a9c6-cddc-000000000023 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204116.48238: no more pending results, returning what we have 34589 1727204116.48241: results queue empty 34589 1727204116.48242: checking for any_errors_fatal 34589 1727204116.48248: done checking for any_errors_fatal 34589 1727204116.48249: checking for max_fail_percentage 34589 1727204116.48250: done checking for max_fail_percentage 34589 1727204116.48251: checking to see if all hosts have failed and the running result is not ok 34589 1727204116.48251: done checking to see if all hosts have failed 34589 1727204116.48252: getting the remaining hosts for this loop 34589 1727204116.48253: done getting the remaining hosts for this loop 34589 1727204116.48257: getting the next task for host managed-node1 34589 1727204116.48263: done getting next task for host managed-node1 34589 1727204116.48266: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204116.48269: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204116.48282: getting variables 34589 1727204116.48284: in VariableManager get_vars() 34589 1727204116.48329: Calling all_inventory to load vars for managed-node1 34589 1727204116.48332: Calling groups_inventory to load vars for managed-node1 34589 1727204116.48335: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204116.48345: Calling all_plugins_play to load vars for managed-node1 34589 1727204116.48348: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204116.48351: Calling groups_plugins_play to load vars for managed-node1 34589 1727204116.48982: done sending task result for task 028d2410-947f-a9c6-cddc-000000000023 34589 1727204116.48985: WORKER PROCESS EXITING 34589 1727204116.51426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204116.53285: done with get_vars() 34589 1727204116.53317: done getting variables 34589 1727204116.53391: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:16 -0400 (0:00:01.288) 0:00:16.669 ***** 34589 1727204116.53428: entering _queue_task() for managed-node1/service 34589 1727204116.53799: worker is 1 (out of 1 available) 34589 1727204116.53925: exiting _queue_task() for managed-node1/service 34589 1727204116.53938: done queuing things up, now waiting for results queue to drain 34589 1727204116.53940: waiting for pending results... 
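
The trace above shows the role's service action invoking ansible.legacy.systemd for managed-node1 with the module args name=NetworkManager, state=started, enabled=true, and the task returning ok with its output censored because no_log was in effect. The following is a minimal sketch of an equivalent standalone task, reconstructed only from the module args and the conditional visible in this trace (network_provider == "nm" or network_state != {}), not copied from the role source; the literal service name stands in for the network_service_name variable resolved earlier in the log.

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager          # resolved from network_service_name in the trace above
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
      no_log: true                    # matches the censored "ok" result reported in the log

The task queued below, "Enable and start wpa_supplicant", follows the same pattern but is skipped on this host because __network_wpa_supplicant_required evaluates to False, as the trace that follows shows.
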
34589 1727204116.54138: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204116.54314: in run() - task 028d2410-947f-a9c6-cddc-000000000024 34589 1727204116.54336: variable 'ansible_search_path' from source: unknown 34589 1727204116.54365: variable 'ansible_search_path' from source: unknown 34589 1727204116.54396: calling self._execute() 34589 1727204116.54496: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.54584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.54587: variable 'omit' from source: magic vars 34589 1727204116.55032: variable 'ansible_distribution_major_version' from source: facts 34589 1727204116.55235: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204116.55288: variable 'network_provider' from source: set_fact 34589 1727204116.55299: Evaluated conditional (network_provider == "nm"): True 34589 1727204116.55423: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204116.55527: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204116.55713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204116.57995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204116.58074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204116.58123: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204116.58166: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204116.58202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204116.58304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204116.58337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204116.58362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204116.58413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204116.58430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204116.58484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204116.58581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34589 1727204116.58584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204116.58596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204116.58628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204116.58674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204116.58714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204116.58747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204116.58794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204116.58817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204116.59326: variable 'network_connections' from source: task vars 34589 1727204116.59434: variable 'interface' from source: set_fact 34589 1727204116.59438: variable 'interface' from source: set_fact 34589 1727204116.59450: variable 'interface' from source: set_fact 34589 1727204116.59522: variable 'interface' from source: set_fact 34589 1727204116.59727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204116.59935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204116.59990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204116.60055: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204116.60098: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204116.60150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204116.60234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204116.60335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204116.60367: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204116.60534: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204116.61118: variable 'network_connections' from source: task vars 34589 1727204116.61280: variable 'interface' from source: set_fact 34589 1727204116.61283: variable 'interface' from source: set_fact 34589 1727204116.61287: variable 'interface' from source: set_fact 34589 1727204116.61585: variable 'interface' from source: set_fact 34589 1727204116.61588: Evaluated conditional (__network_wpa_supplicant_required): False 34589 1727204116.61590: when evaluation is False, skipping this task 34589 1727204116.61592: _execute() done 34589 1727204116.61601: dumping result to json 34589 1727204116.61603: done dumping result, returning 34589 1727204116.61622: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-a9c6-cddc-000000000024] 34589 1727204116.61664: sending task result for task 028d2410-947f-a9c6-cddc-000000000024 34589 1727204116.61964: done sending task result for task 028d2410-947f-a9c6-cddc-000000000024 34589 1727204116.61968: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34589 1727204116.62024: no more pending results, returning what we have 34589 1727204116.62028: results queue empty 34589 1727204116.62031: checking for any_errors_fatal 34589 1727204116.62060: done checking for any_errors_fatal 34589 1727204116.62062: checking for max_fail_percentage 34589 1727204116.62064: done checking for max_fail_percentage 34589 1727204116.62065: checking to see if all hosts have failed and the running result is not ok 34589 1727204116.62066: done checking to see if all hosts have failed 34589 1727204116.62066: getting the remaining hosts for this loop 34589 1727204116.62067: done getting the remaining hosts for this loop 34589 1727204116.62072: getting the next task for host managed-node1 34589 1727204116.62081: done getting next task for host managed-node1 34589 1727204116.62085: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204116.62088: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204116.62103: getting variables 34589 1727204116.62105: in VariableManager get_vars() 34589 1727204116.62148: Calling all_inventory to load vars for managed-node1 34589 1727204116.62151: Calling groups_inventory to load vars for managed-node1 34589 1727204116.62154: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204116.62279: Calling all_plugins_play to load vars for managed-node1 34589 1727204116.62283: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204116.62287: Calling groups_plugins_play to load vars for managed-node1 34589 1727204116.63974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204116.66482: done with get_vars() 34589 1727204116.66515: done getting variables 34589 1727204116.66582: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.131) 0:00:16.801 ***** 34589 1727204116.66657: entering _queue_task() for managed-node1/service 34589 1727204116.67121: worker is 1 (out of 1 available) 34589 1727204116.67245: exiting _queue_task() for managed-node1/service 34589 1727204116.67257: done queuing things up, now waiting for results queue to drain 34589 1727204116.67258: waiting for pending results... 34589 1727204116.67444: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204116.67675: in run() - task 028d2410-947f-a9c6-cddc-000000000025 34589 1727204116.67680: variable 'ansible_search_path' from source: unknown 34589 1727204116.67682: variable 'ansible_search_path' from source: unknown 34589 1727204116.67685: calling self._execute() 34589 1727204116.67756: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.67768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.67788: variable 'omit' from source: magic vars 34589 1727204116.68202: variable 'ansible_distribution_major_version' from source: facts 34589 1727204116.68227: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204116.68354: variable 'network_provider' from source: set_fact 34589 1727204116.68365: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204116.68372: when evaluation is False, skipping this task 34589 1727204116.68382: _execute() done 34589 1727204116.68435: dumping result to json 34589 1727204116.68438: done dumping result, returning 34589 1727204116.68441: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-a9c6-cddc-000000000025] 34589 1727204116.68444: sending task result for task 028d2410-947f-a9c6-cddc-000000000025 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204116.68584: no more pending results, returning what we have 34589 1727204116.68588: results queue empty 34589 1727204116.68589: checking for 
any_errors_fatal 34589 1727204116.68601: done checking for any_errors_fatal 34589 1727204116.68602: checking for max_fail_percentage 34589 1727204116.68604: done checking for max_fail_percentage 34589 1727204116.68605: checking to see if all hosts have failed and the running result is not ok 34589 1727204116.68606: done checking to see if all hosts have failed 34589 1727204116.68609: getting the remaining hosts for this loop 34589 1727204116.68610: done getting the remaining hosts for this loop 34589 1727204116.68614: getting the next task for host managed-node1 34589 1727204116.68622: done getting next task for host managed-node1 34589 1727204116.68626: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204116.68630: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204116.68647: getting variables 34589 1727204116.68649: in VariableManager get_vars() 34589 1727204116.68691: Calling all_inventory to load vars for managed-node1 34589 1727204116.68694: Calling groups_inventory to load vars for managed-node1 34589 1727204116.68697: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204116.68713: Calling all_plugins_play to load vars for managed-node1 34589 1727204116.68717: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204116.68720: Calling groups_plugins_play to load vars for managed-node1 34589 1727204116.69474: done sending task result for task 028d2410-947f-a9c6-cddc-000000000025 34589 1727204116.69481: WORKER PROCESS EXITING 34589 1727204116.71579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204116.73124: done with get_vars() 34589 1727204116.73158: done getting variables 34589 1727204116.73223: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.066) 0:00:16.867 ***** 34589 1727204116.73262: entering _queue_task() for managed-node1/copy 34589 1727204116.73632: worker is 1 (out of 1 available) 34589 1727204116.73645: exiting _queue_task() for managed-node1/copy 34589 1727204116.73658: done queuing things up, now waiting for results queue to drain 34589 1727204116.73660: waiting for pending results... 
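
Both the "Enable network service" task above and the "Ensure initscripts network file dependency is present" task queued here are gated on network_provider == "initscripts", which this run evaluates to false because network_provider was resolved to "nm" by an earlier set_fact. A hedged sketch of pinning that provider explicitly when invoking the role (network_provider is the role's documented input; the play layout is illustrative, not the test playbook):

    - hosts: managed-node1
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
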
34589 1727204116.73972: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204116.74139: in run() - task 028d2410-947f-a9c6-cddc-000000000026 34589 1727204116.74159: variable 'ansible_search_path' from source: unknown 34589 1727204116.74167: variable 'ansible_search_path' from source: unknown 34589 1727204116.74213: calling self._execute() 34589 1727204116.74333: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.74346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.74360: variable 'omit' from source: magic vars 34589 1727204116.74745: variable 'ansible_distribution_major_version' from source: facts 34589 1727204116.74769: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204116.74890: variable 'network_provider' from source: set_fact 34589 1727204116.74900: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204116.74985: when evaluation is False, skipping this task 34589 1727204116.74988: _execute() done 34589 1727204116.74991: dumping result to json 34589 1727204116.74993: done dumping result, returning 34589 1727204116.74996: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-a9c6-cddc-000000000026] 34589 1727204116.74998: sending task result for task 028d2410-947f-a9c6-cddc-000000000026 34589 1727204116.75073: done sending task result for task 028d2410-947f-a9c6-cddc-000000000026 34589 1727204116.75078: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34589 1727204116.75141: no more pending results, returning what we have 34589 1727204116.75146: results queue empty 34589 1727204116.75146: checking for any_errors_fatal 34589 1727204116.75155: done checking for any_errors_fatal 34589 1727204116.75156: checking for max_fail_percentage 34589 1727204116.75157: done checking for max_fail_percentage 34589 1727204116.75158: checking to see if all hosts have failed and the running result is not ok 34589 1727204116.75159: done checking to see if all hosts have failed 34589 1727204116.75159: getting the remaining hosts for this loop 34589 1727204116.75161: done getting the remaining hosts for this loop 34589 1727204116.75164: getting the next task for host managed-node1 34589 1727204116.75172: done getting next task for host managed-node1 34589 1727204116.75177: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204116.75180: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204116.75282: getting variables 34589 1727204116.75284: in VariableManager get_vars() 34589 1727204116.75329: Calling all_inventory to load vars for managed-node1 34589 1727204116.75332: Calling groups_inventory to load vars for managed-node1 34589 1727204116.75335: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204116.75347: Calling all_plugins_play to load vars for managed-node1 34589 1727204116.75350: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204116.75353: Calling groups_plugins_play to load vars for managed-node1 34589 1727204116.76954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204116.78582: done with get_vars() 34589 1727204116.78614: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.054) 0:00:16.922 ***** 34589 1727204116.78713: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204116.78714: Creating lock for fedora.linux_system_roles.network_connections 34589 1727204116.79290: worker is 1 (out of 1 available) 34589 1727204116.79302: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204116.79314: done queuing things up, now waiting for results queue to drain 34589 1727204116.79316: waiting for pending results... 34589 1727204116.79448: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204116.79653: in run() - task 028d2410-947f-a9c6-cddc-000000000027 34589 1727204116.79657: variable 'ansible_search_path' from source: unknown 34589 1727204116.79659: variable 'ansible_search_path' from source: unknown 34589 1727204116.79661: calling self._execute() 34589 1727204116.79726: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.79735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.79746: variable 'omit' from source: magic vars 34589 1727204116.80112: variable 'ansible_distribution_major_version' from source: facts 34589 1727204116.80130: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204116.80141: variable 'omit' from source: magic vars 34589 1727204116.80212: variable 'omit' from source: magic vars 34589 1727204116.80386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204116.83020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204116.83124: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204116.83257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204116.83261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204116.83263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204116.83352: variable 'network_provider' from source: set_fact 34589 1727204116.83516: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204116.83551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204116.83591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204116.83642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204116.83694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204116.83753: variable 'omit' from source: magic vars 34589 1727204116.83886: variable 'omit' from source: magic vars 34589 1727204116.84033: variable 'network_connections' from source: task vars 34589 1727204116.84181: variable 'interface' from source: set_fact 34589 1727204116.84184: variable 'interface' from source: set_fact 34589 1727204116.84187: variable 'interface' from source: set_fact 34589 1727204116.84201: variable 'interface' from source: set_fact 34589 1727204116.84387: variable 'omit' from source: magic vars 34589 1727204116.84400: variable '__lsr_ansible_managed' from source: task vars 34589 1727204116.84473: variable '__lsr_ansible_managed' from source: task vars 34589 1727204116.84770: Loaded config def from plugin (lookup/template) 34589 1727204116.84783: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34589 1727204116.84819: File lookup term: get_ansible_managed.j2 34589 1727204116.84828: variable 'ansible_search_path' from source: unknown 34589 1727204116.84837: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34589 1727204116.84863: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34589 1727204116.84889: variable 'ansible_search_path' from source: unknown 34589 1727204116.89437: variable 'ansible_managed' from source: unknown 34589 
1727204116.89512: variable 'omit' from source: magic vars 34589 1727204116.89531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204116.89556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204116.89569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204116.89610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204116.89613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204116.89626: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204116.89630: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.89632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.89698: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204116.89701: Set connection var ansible_shell_executable to /bin/sh 34589 1727204116.89711: Set connection var ansible_timeout to 10 34589 1727204116.89714: Set connection var ansible_shell_type to sh 34589 1727204116.89716: Set connection var ansible_connection to ssh 34589 1727204116.89722: Set connection var ansible_pipelining to False 34589 1727204116.89739: variable 'ansible_shell_executable' from source: unknown 34589 1727204116.89741: variable 'ansible_connection' from source: unknown 34589 1727204116.89745: variable 'ansible_module_compression' from source: unknown 34589 1727204116.89747: variable 'ansible_shell_type' from source: unknown 34589 1727204116.89749: variable 'ansible_shell_executable' from source: unknown 34589 1727204116.89751: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204116.89755: variable 'ansible_pipelining' from source: unknown 34589 1727204116.89758: variable 'ansible_timeout' from source: unknown 34589 1727204116.89761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204116.89986: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204116.89997: variable 'omit' from source: magic vars 34589 1727204116.89999: starting attempt loop 34589 1727204116.90002: running the handler 34589 1727204116.90004: _low_level_execute_command(): starting 34589 1727204116.90009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204116.90438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204116.90455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204116.90467: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.90513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204116.90519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204116.90533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204116.90631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204116.92415: stdout chunk (state=3): >>>/root <<< 34589 1727204116.92516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204116.92549: stderr chunk (state=3): >>><<< 34589 1727204116.92551: stdout chunk (state=3): >>><<< 34589 1727204116.92565: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204116.92586: _low_level_execute_command(): starting 34589 1727204116.92590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820 `" && echo ansible-tmp-1727204116.925728-36475-218097996171820="` echo /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820 `" ) && sleep 0' 34589 1727204116.93015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204116.93018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204116.93021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.93023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 34589 1727204116.93025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204116.93069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204116.93077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204116.93080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204116.93155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204116.95229: stdout chunk (state=3): >>>ansible-tmp-1727204116.925728-36475-218097996171820=/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820 <<< 34589 1727204116.95339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204116.95365: stderr chunk (state=3): >>><<< 34589 1727204116.95368: stdout chunk (state=3): >>><<< 34589 1727204116.95385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204116.925728-36475-218097996171820=/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204116.95420: variable 'ansible_module_compression' from source: unknown 34589 1727204116.95454: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 34589 1727204116.95457: ANSIBALLZ: Acquiring lock 34589 1727204116.95460: ANSIBALLZ: Lock acquired: 140222010805232 34589 1727204116.95462: ANSIBALLZ: Creating module 34589 1727204117.07988: ANSIBALLZ: Writing module into payload 34589 1727204117.08210: ANSIBALLZ: Writing module 34589 1727204117.08226: ANSIBALLZ: Renaming module 34589 1727204117.08234: ANSIBALLZ: Done creating module 34589 1727204117.08255: variable 'ansible_facts' from source: unknown 34589 1727204117.08321: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py 34589 1727204117.08422: Sending initial data 34589 1727204117.08425: Sent initial data (167 bytes) 34589 1727204117.08852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204117.08859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204117.08890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204117.08894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204117.08896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204117.08898: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204117.08954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204117.08957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204117.08959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204117.09046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204117.10765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34589 1727204117.10769: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204117.10837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204117.10917: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp6rsjso2m /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py <<< 34589 1727204117.10920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py" <<< 34589 1727204117.10992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp6rsjso2m" to remote "/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py" <<< 34589 1727204117.10995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py" <<< 34589 1727204117.11833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204117.11874: stderr chunk (state=3): >>><<< 34589 1727204117.11879: stdout chunk (state=3): >>><<< 34589 1727204117.11910: done transferring module to remote 34589 1727204117.11925: _low_level_execute_command(): starting 34589 1727204117.11928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/ /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py && sleep 0' 34589 1727204117.12343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204117.12378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204117.12381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204117.12384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204117.12386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204117.12388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204117.12437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204117.12443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204117.12445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204117.12522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204117.14705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204117.14712: stderr chunk (state=3): >>><<< 34589 1727204117.14715: stdout chunk (state=3): >>><<< 34589 1727204117.14718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204117.14720: _low_level_execute_command(): starting 34589 1727204117.14722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/AnsiballZ_network_connections.py && sleep 0' 34589 1727204117.15283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204117.15334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204117.15419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204117.43990: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34589 1727204117.46155: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204117.46159: stdout chunk (state=3): >>><<< 34589 1727204117.46162: stderr chunk (state=3): >>><<< 34589 1727204117.46325: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
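
The module_args echoed in the result above map directly onto the role's network_connections input. Reconstructed from this log (not copied from the test playbook), the corresponding variable is roughly:

    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        ip:
          ipv6_disabled: true
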
34589 1727204117.46329: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204117.46332: _low_level_execute_command(): starting 34589 1727204117.46334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204116.925728-36475-218097996171820/ > /dev/null 2>&1 && sleep 0' 34589 1727204117.46998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204117.47046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204117.47070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204117.47193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204117.49330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204117.49404: stderr chunk (state=3): >>><<< 34589 1727204117.49414: stdout chunk (state=3): >>><<< 34589 1727204117.49539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204117.49542: handler run complete 34589 1727204117.49801: attempt loop complete, returning result 34589 1727204117.49805: _execute() done 34589 1727204117.49807: dumping result to json 34589 1727204117.49809: done dumping result, returning 34589 1727204117.49811: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-a9c6-cddc-000000000027] 34589 1727204117.49813: sending task result for task 028d2410-947f-a9c6-cddc-000000000027 34589 1727204117.49887: done sending task result for task 028d2410-947f-a9c6-cddc-000000000027 34589 1727204117.49891: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796 34589 1727204117.50001: no more pending results, returning what we have 34589 1727204117.50007: results queue empty 34589 1727204117.50008: checking for any_errors_fatal 34589 1727204117.50017: done checking for any_errors_fatal 34589 1727204117.50018: checking for max_fail_percentage 34589 1727204117.50020: done checking for max_fail_percentage 34589 1727204117.50020: checking to see if all hosts have failed and the running result is not ok 34589 1727204117.50021: done checking to see if all hosts have failed 34589 1727204117.50022: getting the remaining hosts for this loop 34589 1727204117.50023: done getting the remaining hosts for this loop 34589 1727204117.50027: getting the next task for host managed-node1 34589 1727204117.50035: done getting next task for host managed-node1 34589 1727204117.50039: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204117.50042: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204117.50056: getting variables 34589 1727204117.50058: in VariableManager get_vars() 34589 1727204117.50327: Calling all_inventory to load vars for managed-node1 34589 1727204117.50331: Calling groups_inventory to load vars for managed-node1 34589 1727204117.50333: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204117.50344: Calling all_plugins_play to load vars for managed-node1 34589 1727204117.50347: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204117.50350: Calling groups_plugins_play to load vars for managed-node1 34589 1727204117.53125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204117.55756: done with get_vars() 34589 1727204117.55817: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.772) 0:00:17.694 ***** 34589 1727204117.55914: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204117.55916: Creating lock for fedora.linux_system_roles.network_state 34589 1727204117.56500: worker is 1 (out of 1 available) 34589 1727204117.56514: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204117.56527: done queuing things up, now waiting for results queue to drain 34589 1727204117.56529: waiting for pending results... 34589 1727204117.56918: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204117.57037: in run() - task 028d2410-947f-a9c6-cddc-000000000028 34589 1727204117.57092: variable 'ansible_search_path' from source: unknown 34589 1727204117.57097: variable 'ansible_search_path' from source: unknown 34589 1727204117.57119: calling self._execute() 34589 1727204117.57315: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.57343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.57346: variable 'omit' from source: magic vars 34589 1727204117.57732: variable 'ansible_distribution_major_version' from source: facts 34589 1727204117.57782: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204117.57891: variable 'network_state' from source: role '' defaults 34589 1727204117.57909: Evaluated conditional (network_state != {}): False 34589 1727204117.57917: when evaluation is False, skipping this task 34589 1727204117.57925: _execute() done 34589 1727204117.57995: dumping result to json 34589 1727204117.58006: done dumping result, returning 34589 1727204117.58010: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-a9c6-cddc-000000000028] 34589 1727204117.58012: sending task result for task 028d2410-947f-a9c6-cddc-000000000028 34589 1727204117.58281: done sending task result for task 028d2410-947f-a9c6-cddc-000000000028 34589 1727204117.58286: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204117.58345: no more pending results, returning what we have 34589 1727204117.58350: results queue empty 34589 1727204117.58350: checking for any_errors_fatal 34589 1727204117.58363: done checking for 
any_errors_fatal 34589 1727204117.58364: checking for max_fail_percentage 34589 1727204117.58366: done checking for max_fail_percentage 34589 1727204117.58367: checking to see if all hosts have failed and the running result is not ok 34589 1727204117.58367: done checking to see if all hosts have failed 34589 1727204117.58368: getting the remaining hosts for this loop 34589 1727204117.58370: done getting the remaining hosts for this loop 34589 1727204117.58373: getting the next task for host managed-node1 34589 1727204117.58383: done getting next task for host managed-node1 34589 1727204117.58387: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204117.58391: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204117.58489: getting variables 34589 1727204117.58491: in VariableManager get_vars() 34589 1727204117.58530: Calling all_inventory to load vars for managed-node1 34589 1727204117.58533: Calling groups_inventory to load vars for managed-node1 34589 1727204117.58536: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204117.58544: Calling all_plugins_play to load vars for managed-node1 34589 1727204117.58547: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204117.58550: Calling groups_plugins_play to load vars for managed-node1 34589 1727204117.60447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204117.63103: done with get_vars() 34589 1727204117.63127: done getting variables 34589 1727204117.63353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.074) 0:00:17.768 ***** 34589 1727204117.63388: entering _queue_task() for managed-node1/debug 34589 1727204117.63956: worker is 1 (out of 1 available) 34589 1727204117.63970: exiting _queue_task() for managed-node1/debug 34589 1727204117.64087: done queuing things up, now waiting for results queue to drain 34589 1727204117.64090: waiting for pending results... 
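
The debug task queued here prints __network_connections_result.stderr_lines, which is exactly the key shown in the ok result further down. A minimal sketch of such a task (illustrative, not the role's verbatim source):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
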
34589 1727204117.64569: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204117.64890: in run() - task 028d2410-947f-a9c6-cddc-000000000029 34589 1727204117.64922: variable 'ansible_search_path' from source: unknown 34589 1727204117.64926: variable 'ansible_search_path' from source: unknown 34589 1727204117.64961: calling self._execute() 34589 1727204117.65058: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.65062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.65072: variable 'omit' from source: magic vars 34589 1727204117.65860: variable 'ansible_distribution_major_version' from source: facts 34589 1727204117.65868: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204117.65973: variable 'omit' from source: magic vars 34589 1727204117.66139: variable 'omit' from source: magic vars 34589 1727204117.66179: variable 'omit' from source: magic vars 34589 1727204117.66224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204117.66258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204117.66279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204117.66501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.66515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.66545: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204117.66549: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.66551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.66656: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204117.66662: Set connection var ansible_shell_executable to /bin/sh 34589 1727204117.66671: Set connection var ansible_timeout to 10 34589 1727204117.66673: Set connection var ansible_shell_type to sh 34589 1727204117.66953: Set connection var ansible_connection to ssh 34589 1727204117.66957: Set connection var ansible_pipelining to False 34589 1727204117.66959: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.66961: variable 'ansible_connection' from source: unknown 34589 1727204117.66964: variable 'ansible_module_compression' from source: unknown 34589 1727204117.66966: variable 'ansible_shell_type' from source: unknown 34589 1727204117.66967: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.66969: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.66971: variable 'ansible_pipelining' from source: unknown 34589 1727204117.66973: variable 'ansible_timeout' from source: unknown 34589 1727204117.66977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.67080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 
1727204117.67294: variable 'omit' from source: magic vars 34589 1727204117.67300: starting attempt loop 34589 1727204117.67302: running the handler 34589 1727204117.67437: variable '__network_connections_result' from source: set_fact 34589 1727204117.67690: handler run complete 34589 1727204117.67711: attempt loop complete, returning result 34589 1727204117.67715: _execute() done 34589 1727204117.67717: dumping result to json 34589 1727204117.67720: done dumping result, returning 34589 1727204117.67729: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-a9c6-cddc-000000000029] 34589 1727204117.67734: sending task result for task 028d2410-947f-a9c6-cddc-000000000029 34589 1727204117.68013: done sending task result for task 028d2410-947f-a9c6-cddc-000000000029 34589 1727204117.68016: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796" ] } 34589 1727204117.68082: no more pending results, returning what we have 34589 1727204117.68086: results queue empty 34589 1727204117.68087: checking for any_errors_fatal 34589 1727204117.68093: done checking for any_errors_fatal 34589 1727204117.68094: checking for max_fail_percentage 34589 1727204117.68096: done checking for max_fail_percentage 34589 1727204117.68096: checking to see if all hosts have failed and the running result is not ok 34589 1727204117.68097: done checking to see if all hosts have failed 34589 1727204117.68098: getting the remaining hosts for this loop 34589 1727204117.68099: done getting the remaining hosts for this loop 34589 1727204117.68102: getting the next task for host managed-node1 34589 1727204117.68109: done getting next task for host managed-node1 34589 1727204117.68112: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204117.68116: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204117.68129: getting variables 34589 1727204117.68131: in VariableManager get_vars() 34589 1727204117.68167: Calling all_inventory to load vars for managed-node1 34589 1727204117.68170: Calling groups_inventory to load vars for managed-node1 34589 1727204117.68172: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204117.68244: Calling all_plugins_play to load vars for managed-node1 34589 1727204117.68248: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204117.68251: Calling groups_plugins_play to load vars for managed-node1 34589 1727204117.70979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204117.74154: done with get_vars() 34589 1727204117.74245: done getting variables 34589 1727204117.74309: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.109) 0:00:17.878 ***** 34589 1727204117.74456: entering _queue_task() for managed-node1/debug 34589 1727204117.75148: worker is 1 (out of 1 available) 34589 1727204117.75160: exiting _queue_task() for managed-node1/debug 34589 1727204117.75173: done queuing things up, now waiting for results queue to drain 34589 1727204117.75174: waiting for pending results... 34589 1727204117.75760: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204117.76178: in run() - task 028d2410-947f-a9c6-cddc-00000000002a 34589 1727204117.76185: variable 'ansible_search_path' from source: unknown 34589 1727204117.76188: variable 'ansible_search_path' from source: unknown 34589 1727204117.76191: calling self._execute() 34589 1727204117.76307: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.76312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.76316: variable 'omit' from source: magic vars 34589 1727204117.77021: variable 'ansible_distribution_major_version' from source: facts 34589 1727204117.77108: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204117.77114: variable 'omit' from source: magic vars 34589 1727204117.77300: variable 'omit' from source: magic vars 34589 1727204117.77340: variable 'omit' from source: magic vars 34589 1727204117.77382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204117.77419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204117.77436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204117.77453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.77464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.77698: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204117.77702: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.77704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.77807: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204117.77832: Set connection var ansible_shell_executable to /bin/sh 34589 1727204117.77835: Set connection var ansible_timeout to 10 34589 1727204117.77838: Set connection var ansible_shell_type to sh 34589 1727204117.77840: Set connection var ansible_connection to ssh 34589 1727204117.77842: Set connection var ansible_pipelining to False 34589 1727204117.77872: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.77876: variable 'ansible_connection' from source: unknown 34589 1727204117.77879: variable 'ansible_module_compression' from source: unknown 34589 1727204117.77882: variable 'ansible_shell_type' from source: unknown 34589 1727204117.77884: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.77886: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.77888: variable 'ansible_pipelining' from source: unknown 34589 1727204117.77890: variable 'ansible_timeout' from source: unknown 34589 1727204117.78160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.78228: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204117.78238: variable 'omit' from source: magic vars 34589 1727204117.78244: starting attempt loop 34589 1727204117.78247: running the handler 34589 1727204117.78498: variable '__network_connections_result' from source: set_fact 34589 1727204117.78579: variable '__network_connections_result' from source: set_fact 34589 1727204117.78893: handler run complete 34589 1727204117.78921: attempt loop complete, returning result 34589 1727204117.78924: _execute() done 34589 1727204117.78927: dumping result to json 34589 1727204117.78961: done dumping result, returning 34589 1727204117.78965: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-a9c6-cddc-00000000002a] 34589 1727204117.78967: sending task result for task 028d2410-947f-a9c6-cddc-00000000002a 34589 1727204117.79038: done sending task result for task 028d2410-947f-a9c6-cddc-00000000002a 34589 1727204117.79041: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 0a64492c-4969-466c-9920-91c73029e796" ] } } 34589 1727204117.79129: no more pending results, returning what we have 34589 1727204117.79133: 
results queue empty 34589 1727204117.79134: checking for any_errors_fatal 34589 1727204117.79142: done checking for any_errors_fatal 34589 1727204117.79143: checking for max_fail_percentage 34589 1727204117.79144: done checking for max_fail_percentage 34589 1727204117.79145: checking to see if all hosts have failed and the running result is not ok 34589 1727204117.79146: done checking to see if all hosts have failed 34589 1727204117.79146: getting the remaining hosts for this loop 34589 1727204117.79148: done getting the remaining hosts for this loop 34589 1727204117.79151: getting the next task for host managed-node1 34589 1727204117.79158: done getting next task for host managed-node1 34589 1727204117.79162: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204117.79165: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204117.79285: getting variables 34589 1727204117.79287: in VariableManager get_vars() 34589 1727204117.79323: Calling all_inventory to load vars for managed-node1 34589 1727204117.79326: Calling groups_inventory to load vars for managed-node1 34589 1727204117.79328: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204117.79338: Calling all_plugins_play to load vars for managed-node1 34589 1727204117.79340: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204117.79343: Calling groups_plugins_play to load vars for managed-node1 34589 1727204117.82346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204117.85639: done with get_vars() 34589 1727204117.85670: done getting variables 34589 1727204117.85896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.115) 0:00:17.994 ***** 34589 1727204117.85930: entering _queue_task() for managed-node1/debug 34589 1727204117.86650: worker is 1 (out of 1 available) 34589 1727204117.86664: exiting _queue_task() for managed-node1/debug 34589 1727204117.86679: done queuing things up, now waiting for results queue to drain 34589 1727204117.86680: waiting for pending results... 
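The two reporting tasks above are plain debug calls over the __network_connections_result fact set by the earlier provisioning step; the ok: output shape, one key per printed variable, is what debug's var option produces. Roughly, and omitting any guards the real role may add:

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

In this run the registered result reports changed: true and a single stderr line recording that the ethtest0 profile (requested with ip: ipv6_disabled: true) was added under the nm provider.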
34589 1727204117.87264: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204117.87270: in run() - task 028d2410-947f-a9c6-cddc-00000000002b 34589 1727204117.87582: variable 'ansible_search_path' from source: unknown 34589 1727204117.87586: variable 'ansible_search_path' from source: unknown 34589 1727204117.87589: calling self._execute() 34589 1727204117.87624: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.87628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.87639: variable 'omit' from source: magic vars 34589 1727204117.88415: variable 'ansible_distribution_major_version' from source: facts 34589 1727204117.88426: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204117.88750: variable 'network_state' from source: role '' defaults 34589 1727204117.88759: Evaluated conditional (network_state != {}): False 34589 1727204117.88763: when evaluation is False, skipping this task 34589 1727204117.88773: _execute() done 34589 1727204117.88778: dumping result to json 34589 1727204117.88781: done dumping result, returning 34589 1727204117.88784: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-a9c6-cddc-00000000002b] 34589 1727204117.88786: sending task result for task 028d2410-947f-a9c6-cddc-00000000002b 34589 1727204117.88951: done sending task result for task 028d2410-947f-a9c6-cddc-00000000002b 34589 1727204117.88955: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 34589 1727204117.89034: no more pending results, returning what we have 34589 1727204117.89038: results queue empty 34589 1727204117.89039: checking for any_errors_fatal 34589 1727204117.89052: done checking for any_errors_fatal 34589 1727204117.89053: checking for max_fail_percentage 34589 1727204117.89055: done checking for max_fail_percentage 34589 1727204117.89056: checking to see if all hosts have failed and the running result is not ok 34589 1727204117.89056: done checking to see if all hosts have failed 34589 1727204117.89057: getting the remaining hosts for this loop 34589 1727204117.89059: done getting the remaining hosts for this loop 34589 1727204117.89063: getting the next task for host managed-node1 34589 1727204117.89071: done getting next task for host managed-node1 34589 1727204117.89077: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204117.89080: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204117.89094: getting variables 34589 1727204117.89096: in VariableManager get_vars() 34589 1727204117.89133: Calling all_inventory to load vars for managed-node1 34589 1727204117.89136: Calling groups_inventory to load vars for managed-node1 34589 1727204117.89138: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204117.89150: Calling all_plugins_play to load vars for managed-node1 34589 1727204117.89152: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204117.89155: Calling groups_plugins_play to load vars for managed-node1 34589 1727204117.91995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204117.95425: done with get_vars() 34589 1727204117.95459: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.096) 0:00:18.091 ***** 34589 1727204117.95621: entering _queue_task() for managed-node1/ping 34589 1727204117.95623: Creating lock for ping 34589 1727204117.96430: worker is 1 (out of 1 available) 34589 1727204117.96445: exiting _queue_task() for managed-node1/ping 34589 1727204117.96459: done queuing things up, now waiting for results queue to drain 34589 1727204117.96460: waiting for pending results... 34589 1727204117.96964: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204117.97199: in run() - task 028d2410-947f-a9c6-cddc-00000000002c 34589 1727204117.97219: variable 'ansible_search_path' from source: unknown 34589 1727204117.97224: variable 'ansible_search_path' from source: unknown 34589 1727204117.97258: calling self._execute() 34589 1727204117.97355: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.97362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.97384: variable 'omit' from source: magic vars 34589 1727204117.98147: variable 'ansible_distribution_major_version' from source: facts 34589 1727204117.98159: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204117.98166: variable 'omit' from source: magic vars 34589 1727204117.98430: variable 'omit' from source: magic vars 34589 1727204117.98465: variable 'omit' from source: magic vars 34589 1727204117.98517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204117.98553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204117.98573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204117.98801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.98807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204117.98881: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204117.98884: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.98887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.98948: Set connection var ansible_module_compression to 
ZIP_DEFLATED 34589 1727204117.98954: Set connection var ansible_shell_executable to /bin/sh 34589 1727204117.98964: Set connection var ansible_timeout to 10 34589 1727204117.98967: Set connection var ansible_shell_type to sh 34589 1727204117.98973: Set connection var ansible_connection to ssh 34589 1727204117.99183: Set connection var ansible_pipelining to False 34589 1727204117.99236: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.99239: variable 'ansible_connection' from source: unknown 34589 1727204117.99242: variable 'ansible_module_compression' from source: unknown 34589 1727204117.99244: variable 'ansible_shell_type' from source: unknown 34589 1727204117.99246: variable 'ansible_shell_executable' from source: unknown 34589 1727204117.99248: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204117.99250: variable 'ansible_pipelining' from source: unknown 34589 1727204117.99253: variable 'ansible_timeout' from source: unknown 34589 1727204117.99255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204117.99673: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204117.99679: variable 'omit' from source: magic vars 34589 1727204117.99682: starting attempt loop 34589 1727204117.99684: running the handler 34589 1727204117.99686: _low_level_execute_command(): starting 34589 1727204117.99689: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204118.00956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204118.00974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.01122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204118.01125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204118.01128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.01132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.01134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.01467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.03223: stdout chunk (state=3): >>>/root <<< 34589 1727204118.03308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.03312: stderr chunk (state=3): >>><<< 34589 1727204118.03317: stdout chunk (state=3): >>><<< 34589 
1727204118.03342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.03357: _low_level_execute_command(): starting 34589 1727204118.03363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569 `" && echo ansible-tmp-1727204118.033428-36643-171766533631569="` echo /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569 `" ) && sleep 0' 34589 1727204118.04582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204118.04585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.04595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204118.04597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204118.04600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.04762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.04765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.04767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.04964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.07119: stdout chunk (state=3): >>>ansible-tmp-1727204118.033428-36643-171766533631569=/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569 <<< 34589 1727204118.07236: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 34589 1727204118.07239: stderr chunk (state=3): >>><<< 34589 1727204118.07242: stdout chunk (state=3): >>><<< 34589 1727204118.07258: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204118.033428-36643-171766533631569=/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.07308: variable 'ansible_module_compression' from source: unknown 34589 1727204118.07351: ANSIBALLZ: Using lock for ping 34589 1727204118.07354: ANSIBALLZ: Acquiring lock 34589 1727204118.07357: ANSIBALLZ: Lock acquired: 140222010690736 34589 1727204118.07359: ANSIBALLZ: Creating module 34589 1727204118.31686: ANSIBALLZ: Writing module into payload 34589 1727204118.31713: ANSIBALLZ: Writing module 34589 1727204118.31744: ANSIBALLZ: Renaming module 34589 1727204118.31844: ANSIBALLZ: Done creating module 34589 1727204118.31847: variable 'ansible_facts' from source: unknown 34589 1727204118.31986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py 34589 1727204118.32246: Sending initial data 34589 1727204118.32391: Sent initial data (152 bytes) 34589 1727204118.33706: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.33817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.33839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 34589 1727204118.34004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.36101: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204118.36174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpcx8sf2xz /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py <<< 34589 1727204118.36180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py" <<< 34589 1727204118.36242: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpcx8sf2xz" to remote "/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py" <<< 34589 1727204118.37546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.37682: stderr chunk (state=3): >>><<< 34589 1727204118.37686: stdout chunk (state=3): >>><<< 34589 1727204118.37842: done transferring module to remote 34589 1727204118.37845: _low_level_execute_command(): starting 34589 1727204118.37847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/ /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py && sleep 0' 34589 1727204118.38913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204118.39092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.39199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 
1727204118.39278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.39407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.41433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.41445: stdout chunk (state=3): >>><<< 34589 1727204118.41488: stderr chunk (state=3): >>><<< 34589 1727204118.41682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.41685: _low_level_execute_command(): starting 34589 1727204118.41688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/AnsiballZ_ping.py && sleep 0' 34589 1727204118.43295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.43343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.43378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.43426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.43636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.59901: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34589 1727204118.61281: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 34589 1727204118.61343: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 34589 1727204118.61354: stdout chunk (state=3): >>><<< 34589 1727204118.61370: stderr chunk (state=3): >>><<< 34589 1727204118.61682: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204118.61687: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204118.61690: _low_level_execute_command(): starting 34589 1727204118.61692: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204118.033428-36643-171766533631569/ > /dev/null 2>&1 && sleep 0' 34589 1727204118.63051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204118.63409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.63424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.63432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.63709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.65700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.65748: stderr chunk (state=3): >>><<< 34589 1727204118.65798: stdout chunk (state=3): >>><<< 34589 1727204118.65910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.65929: handler run complete 34589 1727204118.65946: attempt loop complete, returning result 34589 1727204118.65952: _execute() done 34589 1727204118.65958: dumping result to json 34589 1727204118.65965: done dumping result, returning 34589 1727204118.65978: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-a9c6-cddc-00000000002c] 34589 1727204118.65986: sending task result for task 028d2410-947f-a9c6-cddc-00000000002c ok: [managed-node1] => { "changed": false, "ping": "pong" } 34589 1727204118.66179: no more pending results, returning what we have 34589 1727204118.66182: results queue empty 34589 1727204118.66183: checking for any_errors_fatal 34589 1727204118.66190: done checking for any_errors_fatal 34589 1727204118.66191: checking for max_fail_percentage 34589 1727204118.66193: done checking for max_fail_percentage 34589 1727204118.66193: checking to see if all hosts have failed and the running result is not ok 34589 1727204118.66194: done checking to see if all hosts have failed 34589 1727204118.66195: getting the remaining hosts for this loop 34589 1727204118.66196: done getting the remaining hosts for this loop 34589 1727204118.66200: getting the next task for host managed-node1 34589 1727204118.66323: done getting next task for host managed-node1 34589 1727204118.66327: ^ task is: TASK: meta (role_complete) 34589 1727204118.66330: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204118.66344: getting variables 34589 1727204118.66346: in VariableManager get_vars() 34589 1727204118.66387: Calling all_inventory to load vars for managed-node1 34589 1727204118.66390: Calling groups_inventory to load vars for managed-node1 34589 1727204118.66392: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204118.66402: Calling all_plugins_play to load vars for managed-node1 34589 1727204118.66405: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204118.66408: Calling groups_plugins_play to load vars for managed-node1 34589 1727204118.66958: done sending task result for task 028d2410-947f-a9c6-cddc-00000000002c 34589 1727204118.66962: WORKER PROCESS EXITING 34589 1727204118.70097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204118.73478: done with get_vars() 34589 1727204118.73506: done getting variables 34589 1727204118.73711: done queuing things up, now waiting for results queue to drain 34589 1727204118.73713: results queue empty 34589 1727204118.73714: checking for any_errors_fatal 34589 1727204118.73717: done checking for any_errors_fatal 34589 1727204118.73718: checking for max_fail_percentage 34589 1727204118.73719: done checking for max_fail_percentage 34589 1727204118.73720: checking to see if all hosts have failed and the running result is not ok 34589 1727204118.73721: done checking to see if all hosts have failed 34589 1727204118.73721: getting the remaining hosts for this loop 34589 1727204118.73722: done getting the remaining hosts for this loop 34589 1727204118.73725: getting the next task for host managed-node1 34589 1727204118.73729: done getting next task for host managed-node1 34589 1727204118.73731: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 34589 1727204118.73733: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204118.73735: getting variables 34589 1727204118.73736: in VariableManager get_vars() 34589 1727204118.73748: Calling all_inventory to load vars for managed-node1 34589 1727204118.73751: Calling groups_inventory to load vars for managed-node1 34589 1727204118.73753: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204118.73871: Calling all_plugins_play to load vars for managed-node1 34589 1727204118.73875: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204118.73879: Calling groups_plugins_play to load vars for managed-node1 34589 1727204118.76246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204118.79796: done with get_vars() 34589 1727204118.79819: done getting variables 34589 1727204118.79991: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Tuesday 24 September 2024 14:55:18 -0400 (0:00:00.843) 0:00:18.935 ***** 34589 1727204118.80021: entering _queue_task() for managed-node1/assert 34589 1727204118.80816: worker is 1 (out of 1 available) 34589 1727204118.80948: exiting _queue_task() for managed-node1/assert 34589 1727204118.80962: done queuing things up, now waiting for results queue to drain 34589 1727204118.80964: waiting for pending results... 
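The Re-test connectivity step logged above is the role's closing health check: a bare ping module run against the managed node, which the log traces at the transport level (create the remote ansible-tmp directory, push AnsiballZ_ping.py over the existing SSH ControlMaster session via SFTP, chmod u+x, execute it with /usr/bin/python3.12, read back {"ping": "pong"}, then remove the temp directory). A minimal sketch of the task as implied by the logged module_args, where data is left at its "pong" default:

    # each numbered _low_level_execute_command() call in the log corresponds to one
    # step of running this single module: mkdir the tmp dir, sftp the AnsiballZ
    # payload, chmod it, run it with the remote python, then rm -rf the tmp dir
    - name: Re-test connectivity
      ping:

Its pong result feeds the meta (role_complete) step that follows, after which control returns to the test playbook.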
34589 1727204118.81242: running TaskExecutor() for managed-node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 34589 1727204118.81444: in run() - task 028d2410-947f-a9c6-cddc-00000000005c 34589 1727204118.81572: variable 'ansible_search_path' from source: unknown 34589 1727204118.81609: calling self._execute() 34589 1727204118.81816: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204118.81821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204118.81828: variable 'omit' from source: magic vars 34589 1727204118.82473: variable 'ansible_distribution_major_version' from source: facts 34589 1727204118.82695: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204118.82895: variable '__network_connections_result' from source: set_fact 34589 1727204118.82923: Evaluated conditional (__network_connections_result.failed): False 34589 1727204118.82926: when evaluation is False, skipping this task 34589 1727204118.82929: _execute() done 34589 1727204118.82932: dumping result to json 34589 1727204118.82935: done dumping result, returning 34589 1727204118.82938: done running TaskExecutor() for managed-node1/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it [028d2410-947f-a9c6-cddc-00000000005c] 34589 1727204118.83018: sending task result for task 028d2410-947f-a9c6-cddc-00000000005c 34589 1727204118.83092: done sending task result for task 028d2410-947f-a9c6-cddc-00000000005c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 34589 1727204118.83151: no more pending results, returning what we have 34589 1727204118.83155: results queue empty 34589 1727204118.83156: checking for any_errors_fatal 34589 1727204118.83158: done checking for any_errors_fatal 34589 1727204118.83159: checking for max_fail_percentage 34589 1727204118.83160: done checking for max_fail_percentage 34589 1727204118.83161: checking to see if all hosts have failed and the running result is not ok 34589 1727204118.83162: done checking to see if all hosts have failed 34589 1727204118.83162: getting the remaining hosts for this loop 34589 1727204118.83163: done getting the remaining hosts for this loop 34589 1727204118.83167: getting the next task for host managed-node1 34589 1727204118.83173: done getting next task for host managed-node1 34589 1727204118.83178: ^ task is: TASK: Verify nmcli connection ipv6.method 34589 1727204118.83180: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204118.83184: getting variables 34589 1727204118.83185: in VariableManager get_vars() 34589 1727204118.83222: Calling all_inventory to load vars for managed-node1 34589 1727204118.83224: Calling groups_inventory to load vars for managed-node1 34589 1727204118.83228: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204118.83242: Calling all_plugins_play to load vars for managed-node1 34589 1727204118.83245: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204118.83248: Calling groups_plugins_play to load vars for managed-node1 34589 1727204118.83895: WORKER PROCESS EXITING 34589 1727204118.86074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204118.88617: done with get_vars() 34589 1727204118.88651: done getting variables 34589 1727204118.88752: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Tuesday 24 September 2024 14:55:18 -0400 (0:00:00.087) 0:00:19.022 ***** 34589 1727204118.88783: entering _queue_task() for managed-node1/shell 34589 1727204118.88785: Creating lock for shell 34589 1727204118.89158: worker is 1 (out of 1 available) 34589 1727204118.89171: exiting _queue_task() for managed-node1/shell 34589 1727204118.89307: done queuing things up, now waiting for results queue to drain 34589 1727204118.89309: waiting for pending results... 
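The two checks from tests_ipv6_disabled.yml:41 and :48 are complementary: the assert applies only when the role run failed (its condition __network_connections_result.failed is False here, so it skips), while the nmcli verification that follows runs only on success (not __network_connections_result.failed). Only those two conditionals and the use of an interface fact are visible in the log; the assertion body and the exact shell command in the sketch below are illustrative placeholders, not the test's actual content:

    - name: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it
      assert:
        that:
          # placeholder condition; the real test's assertions are not shown in this log
          - "'ipv6' in __network_connections_result.stderr"
      when: __network_connections_result.failed

    - name: Verify nmcli connection ipv6.method
      shell: nmcli connection show "{{ interface }}" | grep ipv6.method   # illustrative command only
      when: not __network_connections_result.failed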
34589 1727204118.89487: running TaskExecutor() for managed-node1/TASK: Verify nmcli connection ipv6.method 34589 1727204118.89595: in run() - task 028d2410-947f-a9c6-cddc-00000000005d 34589 1727204118.89615: variable 'ansible_search_path' from source: unknown 34589 1727204118.89682: calling self._execute() 34589 1727204118.89825: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204118.89839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204118.89865: variable 'omit' from source: magic vars 34589 1727204118.90399: variable 'ansible_distribution_major_version' from source: facts 34589 1727204118.90501: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204118.90839: variable '__network_connections_result' from source: set_fact 34589 1727204118.90843: Evaluated conditional (not __network_connections_result.failed): True 34589 1727204118.90846: variable 'omit' from source: magic vars 34589 1727204118.90849: variable 'omit' from source: magic vars 34589 1727204118.91036: variable 'interface' from source: set_fact 34589 1727204118.91183: variable 'omit' from source: magic vars 34589 1727204118.91272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204118.91278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204118.91404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204118.91429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204118.91445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204118.91483: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204118.91527: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204118.91611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204118.91724: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204118.91740: Set connection var ansible_shell_executable to /bin/sh 34589 1727204118.91754: Set connection var ansible_timeout to 10 34589 1727204118.91762: Set connection var ansible_shell_type to sh 34589 1727204118.91774: Set connection var ansible_connection to ssh 34589 1727204118.91789: Set connection var ansible_pipelining to False 34589 1727204118.91828: variable 'ansible_shell_executable' from source: unknown 34589 1727204118.91836: variable 'ansible_connection' from source: unknown 34589 1727204118.91849: variable 'ansible_module_compression' from source: unknown 34589 1727204118.91855: variable 'ansible_shell_type' from source: unknown 34589 1727204118.91861: variable 'ansible_shell_executable' from source: unknown 34589 1727204118.91867: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204118.91926: variable 'ansible_pipelining' from source: unknown 34589 1727204118.91930: variable 'ansible_timeout' from source: unknown 34589 1727204118.91933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204118.92047: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204118.92069: variable 'omit' from source: magic vars 34589 1727204118.92083: starting attempt loop 34589 1727204118.92091: running the handler 34589 1727204118.92106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204118.92131: _low_level_execute_command(): starting 34589 1727204118.92152: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204118.92906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204118.92920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204118.92993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204118.93022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.93041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204118.93053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.93160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.95183: stdout chunk (state=3): >>>/root <<< 34589 1727204118.95253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.95257: stdout chunk (state=3): >>><<< 34589 1727204118.95266: stderr chunk (state=3): >>><<< 34589 1727204118.95291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.95367: _low_level_execute_command(): starting 34589 1727204118.95373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552 `" && echo ansible-tmp-1727204118.953494-36670-42386257392552="` echo /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552 `" ) && sleep 0' 34589 1727204118.96620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204118.96695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204118.96712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204118.96732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204118.96981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204118.97019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204118.97106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204118.99240: stdout chunk (state=3): >>>ansible-tmp-1727204118.953494-36670-42386257392552=/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552 <<< 34589 1727204118.99422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204118.99461: stdout chunk (state=3): >>><<< 34589 1727204118.99474: stderr chunk (state=3): >>><<< 34589 1727204118.99803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204118.953494-36670-42386257392552=/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204118.99806: variable 'ansible_module_compression' from source: unknown 34589 1727204118.99808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204118.99810: variable 'ansible_facts' from source: unknown 34589 1727204118.99949: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py 34589 1727204119.00240: Sending initial data 34589 1727204119.00288: Sent initial data (154 bytes) 34589 1727204119.01008: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204119.01021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204119.01093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.01127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204119.01139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.01147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.01256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.03046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204119.03135: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204119.03228: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpyfkynhny /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py <<< 34589 1727204119.03232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py" <<< 34589 1727204119.03306: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpyfkynhny" to remote "/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py" <<< 34589 1727204119.04223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.04421: stderr chunk (state=3): >>><<< 34589 1727204119.04424: stdout chunk (state=3): >>><<< 34589 1727204119.04427: done transferring module to remote 34589 1727204119.04429: _low_level_execute_command(): starting 34589 1727204119.04431: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/ /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py && sleep 0' 34589 1727204119.05060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204119.05201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204119.05205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.05237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.05351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.07381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.07385: stdout chunk (state=3): >>><<< 34589 1727204119.07387: stderr chunk (state=3): >>><<< 34589 1727204119.07390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204119.07393: _low_level_execute_command(): starting 34589 1727204119.07396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/AnsiballZ_command.py && sleep 0' 34589 1727204119.07757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204119.07798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204119.07801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.07803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204119.07805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204119.07807: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.07849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.07863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.07951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.26286: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-24 14:55:19.242386", "end": "2024-09-24 14:55:19.260956", "delta": "0:00:00.018570", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204119.28150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204119.28156: stdout chunk (state=3): >>><<< 34589 1727204119.28159: stderr chunk (state=3): >>><<< 34589 1727204119.28289: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-24 14:55:19.242386", "end": "2024-09-24 14:55:19.260956", "delta": "0:00:00.018570", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
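
The command module returned the JSON document dumped above, and the follow-up assert task (a few entries below) only checks that 'disabled' occurs in the registered stdout. A small sketch of that check in plain Python, using only fields copied from the logged result:

import json

# Fields copied from the command module's result shown above; the real result
# contains more keys (cmd, start, end, delta, invocation, ...).
raw = '{"changed": true, "stdout": "ipv6.method: disabled", "rc": 0}'
result = json.loads(raw)

assert result["rc"] == 0, "nmcli | grep pipeline failed"
assert "disabled" in result["stdout"], "ipv6.method is not disabled"
print("ipv6.method disabled is configured correctly")
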
34589 1727204119.28294: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204119.28297: _low_level_execute_command(): starting 34589 1727204119.28300: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204118.953494-36670-42386257392552/ > /dev/null 2>&1 && sleep 0' 34589 1727204119.29351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.29466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.29555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.29679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.31652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.31655: stdout chunk (state=3): >>><<< 34589 1727204119.31658: stderr chunk (state=3): >>><<< 34589 1727204119.31684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204119.31881: handler run complete 34589 1727204119.31885: Evaluated conditional (False): False 34589 1727204119.31895: attempt loop complete, returning result 34589 1727204119.31899: _execute() done 34589 1727204119.31905: dumping result to json 34589 1727204119.31911: done dumping result, returning 34589 1727204119.31914: done running TaskExecutor() for managed-node1/TASK: Verify nmcli connection ipv6.method [028d2410-947f-a9c6-cddc-00000000005d] 34589 1727204119.31917: sending task result for task 028d2410-947f-a9c6-cddc-00000000005d 34589 1727204119.31999: done sending task result for task 028d2410-947f-a9c6-cddc-00000000005d 34589 1727204119.32003: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.018570", "end": "2024-09-24 14:55:19.260956", "rc": 0, "start": "2024-09-24 14:55:19.242386" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 34589 1727204119.32081: no more pending results, returning what we have 34589 1727204119.32084: results queue empty 34589 1727204119.32085: checking for any_errors_fatal 34589 1727204119.32094: done checking for any_errors_fatal 34589 1727204119.32094: checking for max_fail_percentage 34589 1727204119.32096: done checking for max_fail_percentage 34589 1727204119.32097: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.32098: done checking to see if all hosts have failed 34589 1727204119.32098: getting the remaining hosts for this loop 34589 1727204119.32100: done getting the remaining hosts for this loop 34589 1727204119.32105: getting the next task for host managed-node1 34589 1727204119.32113: done getting next task for host managed-node1 34589 1727204119.32115: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 34589 1727204119.32117: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204119.32121: getting variables 34589 1727204119.32122: in VariableManager get_vars() 34589 1727204119.32165: Calling all_inventory to load vars for managed-node1 34589 1727204119.32168: Calling groups_inventory to load vars for managed-node1 34589 1727204119.32170: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.32294: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.32298: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.32303: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.33799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.35973: done with get_vars() 34589 1727204119.36000: done getting variables 34589 1727204119.36078: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.473) 0:00:19.495 ***** 34589 1727204119.36105: entering _queue_task() for managed-node1/assert 34589 1727204119.36428: worker is 1 (out of 1 available) 34589 1727204119.36440: exiting _queue_task() for managed-node1/assert 34589 1727204119.36451: done queuing things up, now waiting for results queue to drain 34589 1727204119.36453: waiting for pending results... 34589 1727204119.36774: running TaskExecutor() for managed-node1/TASK: Assert that ipv6.method disabled is configured correctly 34589 1727204119.36849: in run() - task 028d2410-947f-a9c6-cddc-00000000005e 34589 1727204119.36901: variable 'ansible_search_path' from source: unknown 34589 1727204119.36990: calling self._execute() 34589 1727204119.37063: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.37087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.37131: variable 'omit' from source: magic vars 34589 1727204119.37606: variable 'ansible_distribution_major_version' from source: facts 34589 1727204119.37639: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204119.37783: variable '__network_connections_result' from source: set_fact 34589 1727204119.37791: Evaluated conditional (not __network_connections_result.failed): True 34589 1727204119.37801: variable 'omit' from source: magic vars 34589 1727204119.37890: variable 'omit' from source: magic vars 34589 1727204119.37893: variable 'omit' from source: magic vars 34589 1727204119.37907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204119.37945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204119.37966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204119.38008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204119.38025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 34589 1727204119.38058: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204119.38066: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.38073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.38177: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204119.38190: Set connection var ansible_shell_executable to /bin/sh 34589 1727204119.38202: Set connection var ansible_timeout to 10 34589 1727204119.38211: Set connection var ansible_shell_type to sh 34589 1727204119.38225: Set connection var ansible_connection to ssh 34589 1727204119.38361: Set connection var ansible_pipelining to False 34589 1727204119.38364: variable 'ansible_shell_executable' from source: unknown 34589 1727204119.38367: variable 'ansible_connection' from source: unknown 34589 1727204119.38369: variable 'ansible_module_compression' from source: unknown 34589 1727204119.38371: variable 'ansible_shell_type' from source: unknown 34589 1727204119.38372: variable 'ansible_shell_executable' from source: unknown 34589 1727204119.38374: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.38378: variable 'ansible_pipelining' from source: unknown 34589 1727204119.38380: variable 'ansible_timeout' from source: unknown 34589 1727204119.38382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.38461: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204119.38610: variable 'omit' from source: magic vars 34589 1727204119.38615: starting attempt loop 34589 1727204119.38618: running the handler 34589 1727204119.38688: variable 'ipv6_method' from source: set_fact 34589 1727204119.38703: Evaluated conditional ('disabled' in ipv6_method.stdout): True 34589 1727204119.38715: handler run complete 34589 1727204119.38737: attempt loop complete, returning result 34589 1727204119.38754: _execute() done 34589 1727204119.38764: dumping result to json 34589 1727204119.38772: done dumping result, returning 34589 1727204119.38785: done running TaskExecutor() for managed-node1/TASK: Assert that ipv6.method disabled is configured correctly [028d2410-947f-a9c6-cddc-00000000005e] 34589 1727204119.38793: sending task result for task 028d2410-947f-a9c6-cddc-00000000005e 34589 1727204119.39219: done sending task result for task 028d2410-947f-a9c6-cddc-00000000005e 34589 1727204119.39229: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34589 1727204119.39278: no more pending results, returning what we have 34589 1727204119.39281: results queue empty 34589 1727204119.39281: checking for any_errors_fatal 34589 1727204119.39288: done checking for any_errors_fatal 34589 1727204119.39288: checking for max_fail_percentage 34589 1727204119.39290: done checking for max_fail_percentage 34589 1727204119.39290: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.39291: done checking to see if all hosts have failed 34589 1727204119.39292: getting the remaining hosts for this loop 34589 1727204119.39293: done getting the remaining hosts for this loop 34589 1727204119.39296: getting the 
next task for host managed-node1 34589 1727204119.39300: done getting next task for host managed-node1 34589 1727204119.39302: ^ task is: TASK: Set the connection_failed flag 34589 1727204119.39304: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204119.39307: getting variables 34589 1727204119.39308: in VariableManager get_vars() 34589 1727204119.39345: Calling all_inventory to load vars for managed-node1 34589 1727204119.39347: Calling groups_inventory to load vars for managed-node1 34589 1727204119.39350: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.39358: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.39361: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.39363: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.41623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.44868: done with get_vars() 34589 1727204119.44910: done getting variables 34589 1727204119.44970: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.091) 0:00:19.587 ***** 34589 1727204119.45270: entering _queue_task() for managed-node1/set_fact 34589 1727204119.46214: worker is 1 (out of 1 available) 34589 1727204119.46225: exiting _queue_task() for managed-node1/set_fact 34589 1727204119.46237: done queuing things up, now waiting for results queue to drain 34589 1727204119.46239: waiting for pending results... 
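
The set_fact task queued above is gated on `__network_connections_result.failed`, which is False in this run, so the executor records a skip instead of running the action. A rough illustration of that gate in Python (the dictionary shapes are illustrative only, not Ansible's internal types):

def run_or_skip(task_name, when_result, action):
    # Mirrors the behaviour visible in the log: a false `when` result yields a
    # "skipped" outcome with the reason string shown in the task result.
    if not when_result:
        return {"task": task_name, "skipped": True,
                "skip_reason": "Conditional result was False"}
    return {"task": task_name, "skipped": False, "result": action()}

# __network_connections_result.failed evaluated to False here.
print(run_or_skip("Set the connection_failed flag", False,
                  lambda: {"connection_failed": True}))
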
34589 1727204119.46794: running TaskExecutor() for managed-node1/TASK: Set the connection_failed flag 34589 1727204119.47118: in run() - task 028d2410-947f-a9c6-cddc-00000000005f 34589 1727204119.47122: variable 'ansible_search_path' from source: unknown 34589 1727204119.47179: calling self._execute() 34589 1727204119.47386: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.47457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.47460: variable 'omit' from source: magic vars 34589 1727204119.47948: variable 'ansible_distribution_major_version' from source: facts 34589 1727204119.47965: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204119.48151: variable '__network_connections_result' from source: set_fact 34589 1727204119.48177: Evaluated conditional (__network_connections_result.failed): False 34589 1727204119.48187: when evaluation is False, skipping this task 34589 1727204119.48195: _execute() done 34589 1727204119.48203: dumping result to json 34589 1727204119.48237: done dumping result, returning 34589 1727204119.48247: done running TaskExecutor() for managed-node1/TASK: Set the connection_failed flag [028d2410-947f-a9c6-cddc-00000000005f] 34589 1727204119.48280: sending task result for task 028d2410-947f-a9c6-cddc-00000000005f 34589 1727204119.48458: done sending task result for task 028d2410-947f-a9c6-cddc-00000000005f 34589 1727204119.48462: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 34589 1727204119.48525: no more pending results, returning what we have 34589 1727204119.48529: results queue empty 34589 1727204119.48530: checking for any_errors_fatal 34589 1727204119.48540: done checking for any_errors_fatal 34589 1727204119.48541: checking for max_fail_percentage 34589 1727204119.48543: done checking for max_fail_percentage 34589 1727204119.48544: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.48545: done checking to see if all hosts have failed 34589 1727204119.48546: getting the remaining hosts for this loop 34589 1727204119.48547: done getting the remaining hosts for this loop 34589 1727204119.48551: getting the next task for host managed-node1 34589 1727204119.48560: done getting next task for host managed-node1 34589 1727204119.48562: ^ task is: TASK: meta (flush_handlers) 34589 1727204119.48565: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204119.48579: getting variables 34589 1727204119.48582: in VariableManager get_vars() 34589 1727204119.48624: Calling all_inventory to load vars for managed-node1 34589 1727204119.48628: Calling groups_inventory to load vars for managed-node1 34589 1727204119.48630: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.48643: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.48646: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.48649: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.56946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.59892: done with get_vars() 34589 1727204119.59930: done getting variables 34589 1727204119.60015: in VariableManager get_vars() 34589 1727204119.60028: Calling all_inventory to load vars for managed-node1 34589 1727204119.60038: Calling groups_inventory to load vars for managed-node1 34589 1727204119.60041: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.60046: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.60048: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.60051: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.61444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.63199: done with get_vars() 34589 1727204119.63231: done queuing things up, now waiting for results queue to drain 34589 1727204119.63233: results queue empty 34589 1727204119.63234: checking for any_errors_fatal 34589 1727204119.63237: done checking for any_errors_fatal 34589 1727204119.63238: checking for max_fail_percentage 34589 1727204119.63239: done checking for max_fail_percentage 34589 1727204119.63240: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.63240: done checking to see if all hosts have failed 34589 1727204119.63241: getting the remaining hosts for this loop 34589 1727204119.63242: done getting the remaining hosts for this loop 34589 1727204119.63245: getting the next task for host managed-node1 34589 1727204119.63249: done getting next task for host managed-node1 34589 1727204119.63250: ^ task is: TASK: meta (flush_handlers) 34589 1727204119.63252: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204119.63255: getting variables 34589 1727204119.63256: in VariableManager get_vars() 34589 1727204119.63268: Calling all_inventory to load vars for managed-node1 34589 1727204119.63270: Calling groups_inventory to load vars for managed-node1 34589 1727204119.63272: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.63280: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.63282: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.63285: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.64555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.66329: done with get_vars() 34589 1727204119.66357: done getting variables 34589 1727204119.66420: in VariableManager get_vars() 34589 1727204119.66434: Calling all_inventory to load vars for managed-node1 34589 1727204119.66436: Calling groups_inventory to load vars for managed-node1 34589 1727204119.66438: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.66448: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.66451: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.66454: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.68309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.72029: done with get_vars() 34589 1727204119.72058: done queuing things up, now waiting for results queue to drain 34589 1727204119.72061: results queue empty 34589 1727204119.72062: checking for any_errors_fatal 34589 1727204119.72064: done checking for any_errors_fatal 34589 1727204119.72064: checking for max_fail_percentage 34589 1727204119.72065: done checking for max_fail_percentage 34589 1727204119.72066: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.72067: done checking to see if all hosts have failed 34589 1727204119.72068: getting the remaining hosts for this loop 34589 1727204119.72068: done getting the remaining hosts for this loop 34589 1727204119.72071: getting the next task for host managed-node1 34589 1727204119.72082: done getting next task for host managed-node1 34589 1727204119.72083: ^ task is: None 34589 1727204119.72085: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204119.72086: done queuing things up, now waiting for results queue to drain 34589 1727204119.72087: results queue empty 34589 1727204119.72088: checking for any_errors_fatal 34589 1727204119.72088: done checking for any_errors_fatal 34589 1727204119.72089: checking for max_fail_percentage 34589 1727204119.72090: done checking for max_fail_percentage 34589 1727204119.72091: checking to see if all hosts have failed and the running result is not ok 34589 1727204119.72091: done checking to see if all hosts have failed 34589 1727204119.72093: getting the next task for host managed-node1 34589 1727204119.72096: done getting next task for host managed-node1 34589 1727204119.72096: ^ task is: None 34589 1727204119.72098: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204119.72503: in VariableManager get_vars() 34589 1727204119.72522: done with get_vars() 34589 1727204119.72528: in VariableManager get_vars() 34589 1727204119.72541: done with get_vars() 34589 1727204119.72545: variable 'omit' from source: magic vars 34589 1727204119.72913: variable 'profile' from source: play vars 34589 1727204119.73106: in VariableManager get_vars() 34589 1727204119.73153: done with get_vars() 34589 1727204119.73171: variable 'omit' from source: magic vars 34589 1727204119.73367: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 34589 1727204119.74728: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34589 1727204119.74790: getting the remaining hosts for this loop 34589 1727204119.74792: done getting the remaining hosts for this loop 34589 1727204119.74794: getting the next task for host managed-node1 34589 1727204119.74997: done getting next task for host managed-node1 34589 1727204119.75000: ^ task is: TASK: Gathering Facts 34589 1727204119.75002: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204119.75004: getting variables 34589 1727204119.75005: in VariableManager get_vars() 34589 1727204119.75018: Calling all_inventory to load vars for managed-node1 34589 1727204119.75021: Calling groups_inventory to load vars for managed-node1 34589 1727204119.75023: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204119.75029: Calling all_plugins_play to load vars for managed-node1 34589 1727204119.75031: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204119.75034: Calling groups_plugins_play to load vars for managed-node1 34589 1727204119.77472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204119.80629: done with get_vars() 34589 1727204119.80656: done getting variables 34589 1727204119.80704: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.354) 0:00:19.942 ***** 34589 1727204119.80730: entering _queue_task() for managed-node1/gather_facts 34589 1727204119.81474: worker is 1 (out of 1 available) 34589 1727204119.81488: exiting _queue_task() for managed-node1/gather_facts 34589 1727204119.81502: done queuing things up, now waiting for results queue to drain 34589 1727204119.81503: waiting for pending results... 
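
Before the fact-gathering module below is transferred, the worker creates a private, timestamped staging directory on the target (the `umask 77 && mkdir -p ... && mkdir ansible-tmp-...` shell command a few entries down). A local Python sketch of the same pattern, with the directory name format copied from the log and everything else illustrative rather than Ansible's own implementation:

import os
import random
import time

def make_staging_dir(base="~/.ansible/tmp"):
    # Rough equivalent of the logged shell command: restrictive permissions on
    # a freshly created, uniquely named ansible-tmp-<timestamp>-<n>-<n> dir.
    base = os.path.expanduser(base)
    os.makedirs(base, mode=0o700, exist_ok=True)
    name = "ansible-tmp-{}-{}-{}".format(
        time.time(), os.getpid(), random.randint(0, 2**48))
    path = os.path.join(base, name)
    os.mkdir(path, mode=0o700)
    return path

if __name__ == "__main__":
    print(make_staging_dir())
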
34589 1727204119.81955: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34589 1727204119.82213: in run() - task 028d2410-947f-a9c6-cddc-000000000454 34589 1727204119.82318: variable 'ansible_search_path' from source: unknown 34589 1727204119.82378: calling self._execute() 34589 1727204119.82656: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.82660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.82663: variable 'omit' from source: magic vars 34589 1727204119.83473: variable 'ansible_distribution_major_version' from source: facts 34589 1727204119.83525: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204119.83568: variable 'omit' from source: magic vars 34589 1727204119.83601: variable 'omit' from source: magic vars 34589 1727204119.83647: variable 'omit' from source: magic vars 34589 1727204119.83702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204119.83751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204119.83786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204119.83813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204119.83836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204119.83871: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204119.83885: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.83893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.84014: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204119.84027: Set connection var ansible_shell_executable to /bin/sh 34589 1727204119.84040: Set connection var ansible_timeout to 10 34589 1727204119.84099: Set connection var ansible_shell_type to sh 34589 1727204119.84102: Set connection var ansible_connection to ssh 34589 1727204119.84104: Set connection var ansible_pipelining to False 34589 1727204119.84109: variable 'ansible_shell_executable' from source: unknown 34589 1727204119.84111: variable 'ansible_connection' from source: unknown 34589 1727204119.84118: variable 'ansible_module_compression' from source: unknown 34589 1727204119.84126: variable 'ansible_shell_type' from source: unknown 34589 1727204119.84133: variable 'ansible_shell_executable' from source: unknown 34589 1727204119.84139: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204119.84148: variable 'ansible_pipelining' from source: unknown 34589 1727204119.84160: variable 'ansible_timeout' from source: unknown 34589 1727204119.84168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204119.84380: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204119.84389: variable 'omit' from source: magic vars 34589 1727204119.84425: starting attempt loop 34589 1727204119.84428: running the 
handler 34589 1727204119.84434: variable 'ansible_facts' from source: unknown 34589 1727204119.84460: _low_level_execute_command(): starting 34589 1727204119.84474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204119.85273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204119.85305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204119.85367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.85436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.85466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.85627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.87420: stdout chunk (state=3): >>>/root <<< 34589 1727204119.87574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.87579: stdout chunk (state=3): >>><<< 34589 1727204119.87582: stderr chunk (state=3): >>><<< 34589 1727204119.87711: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204119.87715: _low_level_execute_command(): starting 34589 1727204119.87718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110 `" && echo 
ansible-tmp-1727204119.876218-36711-90969665355110="` echo /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110 `" ) && sleep 0' 34589 1727204119.88489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204119.88492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.88494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204119.88502: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204119.88512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.88548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.88580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.88690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.90804: stdout chunk (state=3): >>>ansible-tmp-1727204119.876218-36711-90969665355110=/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110 <<< 34589 1727204119.91023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.91027: stdout chunk (state=3): >>><<< 34589 1727204119.91030: stderr chunk (state=3): >>><<< 34589 1727204119.91054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204119.876218-36711-90969665355110=/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204119.91180: variable 'ansible_module_compression' from source: unknown 34589 
1727204119.91184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204119.91284: variable 'ansible_facts' from source: unknown 34589 1727204119.91494: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py 34589 1727204119.91679: Sending initial data 34589 1727204119.91690: Sent initial data (152 bytes) 34589 1727204119.92494: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.92546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204119.92572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.92617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.92734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.94484: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204119.94539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204119.94637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpbrpvji0r /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py <<< 34589 1727204119.94659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py" <<< 34589 1727204119.94764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpbrpvji0r" to remote "/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py" <<< 34589 1727204119.96756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204119.96771: stderr chunk (state=3): >>><<< 34589 1727204119.96897: stdout chunk (state=3): >>><<< 34589 1727204119.96900: done transferring module to remote 34589 1727204119.96902: _low_level_execute_command(): starting 34589 1727204119.96905: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/ /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py && sleep 0' 34589 1727204119.97547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204119.97567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204119.97584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204119.97602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204119.97638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.97698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204119.97767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204119.97790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204119.97817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204119.97937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204119.99998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204120.00004: stdout chunk (state=3): >>><<< 34589 1727204120.00007: stderr chunk (state=3): >>><<< 34589 1727204120.00025: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204120.00036: _low_level_execute_command(): starting 34589 1727204120.00129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/AnsiballZ_setup.py && sleep 0' 34589 1727204120.00895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204120.00935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204120.00939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204120.00971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204120.01122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204120.69803: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": 
true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.5302734375, "15m": 0.28271484375}, "ansible_date_time": {"year": "2024", "month": "09", "<<< 34589 1727204120.69859: stdout chunk (state=3): >>>weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "20", "epoch": "1727204120", "epoch_int": "1727204120", "date": "2024-09-24", "time": "14:55:20", "iso8601_micro": "2024-09-24T18:55:20.304488Z", "iso8601": "2024-09-24T18:55:20Z", "iso8601_basic": "20240924T145520304488", "iso8601_basic_short": "20240924T145520", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "peerethtest0", "eth0", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb":<<< 34589 1727204120.69879: stdout chunk (state=3): >>> 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": 
"524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785587712, "block_size": 4096, "block_total": 65519099, "block_available": 63912497, "block_used": 1606602, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204120.72243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204120.72272: stderr chunk (state=3): >>><<< 34589 1727204120.72277: stdout chunk (state=3): >>><<< 34589 1727204120.72316: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.5302734375, "15m": 0.28271484375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "20", "epoch": "1727204120", "epoch_int": "1727204120", "date": "2024-09-24", "time": "14:55:20", "iso8601_micro": "2024-09-24T18:55:20.304488Z", "iso8601": "2024-09-24T18:55:20Z", "iso8601_basic": "20240924T145520304488", "iso8601_basic_short": "20240924T145520", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "peerethtest0", "eth0", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785587712, "block_size": 4096, "block_total": 65519099, "block_available": 63912497, "block_used": 1606602, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204120.72704: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204120.72707: _low_level_execute_command(): starting 34589 1727204120.72709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204119.876218-36711-90969665355110/ > /dev/null 2>&1 && sleep 0' 34589 1727204120.73337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204120.73343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204120.73346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204120.73391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204120.73398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204120.73400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204120.73480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204120.75465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204120.75510: stderr chunk (state=3): >>><<< 34589 1727204120.75517: stdout chunk (state=3): >>><<< 34589 1727204120.75543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204120.75546: handler run complete 34589 1727204120.75654: variable 'ansible_facts' from source: unknown 34589 1727204120.75770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.76009: variable 'ansible_facts' from source: unknown 34589 1727204120.76083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.76246: attempt loop complete, returning result 34589 1727204120.76249: _execute() done 34589 1727204120.76251: dumping result to json 34589 1727204120.76269: done dumping result, returning 34589 1727204120.76279: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-a9c6-cddc-000000000454] 34589 1727204120.76281: sending task result for task 028d2410-947f-a9c6-cddc-000000000454 34589 1727204120.76774: done sending task result for task 028d2410-947f-a9c6-cddc-000000000454 ok: [managed-node1] 34589 1727204120.77096: no more pending results, returning what we have 34589 1727204120.77101: results queue empty 34589 1727204120.77102: checking for any_errors_fatal 34589 1727204120.77103: done checking for any_errors_fatal 34589 1727204120.77104: checking for max_fail_percentage 34589 1727204120.77105: done checking for max_fail_percentage 34589 1727204120.77110: checking to see if all hosts have failed and the running result is not ok 34589 1727204120.77111: done checking to see if all hosts have failed 34589 1727204120.77111: getting the remaining hosts for this loop 34589 1727204120.77112: done getting the remaining hosts for this loop 34589 1727204120.77116: getting the next task for host managed-node1 34589 1727204120.77121: done getting next task for host managed-node1 34589 1727204120.77123: ^ task is: TASK: meta (flush_handlers) 34589 1727204120.77124: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204120.77129: getting variables 34589 1727204120.77130: in VariableManager get_vars() 34589 1727204120.77159: Calling all_inventory to load vars for managed-node1 34589 1727204120.77161: Calling groups_inventory to load vars for managed-node1 34589 1727204120.77163: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.77171: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.77173: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.77177: Calling groups_plugins_play to load vars for managed-node1 34589 1727204120.77694: WORKER PROCESS EXITING 34589 1727204120.78084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.79211: done with get_vars() 34589 1727204120.79227: done getting variables 34589 1727204120.79279: in VariableManager get_vars() 34589 1727204120.79288: Calling all_inventory to load vars for managed-node1 34589 1727204120.79290: Calling groups_inventory to load vars for managed-node1 34589 1727204120.79291: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.79295: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.79296: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.79298: Calling groups_plugins_play to load vars for managed-node1 34589 1727204120.80036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.81112: done with get_vars() 34589 1727204120.81133: done queuing things up, now waiting for results queue to drain 34589 1727204120.81134: results queue empty 34589 1727204120.81135: checking for any_errors_fatal 34589 1727204120.81137: done checking for any_errors_fatal 34589 1727204120.81137: checking for max_fail_percentage 34589 1727204120.81138: done checking for max_fail_percentage 34589 1727204120.81143: checking to see if all hosts have failed and the running result is not ok 34589 1727204120.81144: done checking to see if all hosts have failed 34589 1727204120.81144: getting the remaining hosts for this loop 34589 1727204120.81145: done getting the remaining hosts for this loop 34589 1727204120.81147: getting the next task for host managed-node1 34589 1727204120.81150: done getting next task for host managed-node1 34589 1727204120.81152: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204120.81153: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204120.81159: getting variables 34589 1727204120.81160: in VariableManager get_vars() 34589 1727204120.81169: Calling all_inventory to load vars for managed-node1 34589 1727204120.81170: Calling groups_inventory to load vars for managed-node1 34589 1727204120.81171: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.81175: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.81178: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.81180: Calling groups_plugins_play to load vars for managed-node1 34589 1727204120.81837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.82730: done with get_vars() 34589 1727204120.82746: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:20 -0400 (0:00:01.020) 0:00:20.962 ***** 34589 1727204120.82798: entering _queue_task() for managed-node1/include_tasks 34589 1727204120.83081: worker is 1 (out of 1 available) 34589 1727204120.83094: exiting _queue_task() for managed-node1/include_tasks 34589 1727204120.83109: done queuing things up, now waiting for results queue to drain 34589 1727204120.83110: waiting for pending results... 34589 1727204120.83294: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204120.83394: in run() - task 028d2410-947f-a9c6-cddc-000000000067 34589 1727204120.83410: variable 'ansible_search_path' from source: unknown 34589 1727204120.83414: variable 'ansible_search_path' from source: unknown 34589 1727204120.83438: calling self._execute() 34589 1727204120.83517: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204120.83521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204120.83529: variable 'omit' from source: magic vars 34589 1727204120.83862: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.83866: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.83969: variable 'connection_failed' from source: set_fact 34589 1727204120.83972: Evaluated conditional (not connection_failed): True 34589 1727204120.84057: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.84061: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.84142: variable 'connection_failed' from source: set_fact 34589 1727204120.84147: Evaluated conditional (not connection_failed): True 34589 1727204120.84153: _execute() done 34589 1727204120.84155: dumping result to json 34589 1727204120.84160: done dumping result, returning 34589 1727204120.84166: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-a9c6-cddc-000000000067] 34589 1727204120.84171: sending task result for task 028d2410-947f-a9c6-cddc-000000000067 34589 1727204120.84257: done sending task result for task 028d2410-947f-a9c6-cddc-000000000067 34589 1727204120.84260: WORKER PROCESS EXITING 34589 1727204120.84298: no more pending results, returning what we have 34589 1727204120.84304: in VariableManager get_vars() 34589 1727204120.84348: Calling all_inventory to load vars for managed-node1 34589 
1727204120.84351: Calling groups_inventory to load vars for managed-node1 34589 1727204120.84353: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.84364: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.84366: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.84369: Calling groups_plugins_play to load vars for managed-node1 34589 1727204120.85488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.86444: done with get_vars() 34589 1727204120.86459: variable 'ansible_search_path' from source: unknown 34589 1727204120.86461: variable 'ansible_search_path' from source: unknown 34589 1727204120.86483: we have included files to process 34589 1727204120.86484: generating all_blocks data 34589 1727204120.86485: done generating all_blocks data 34589 1727204120.86485: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204120.86486: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204120.86487: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204120.87103: done processing included file 34589 1727204120.87105: iterating over new_blocks loaded from include file 34589 1727204120.87108: in VariableManager get_vars() 34589 1727204120.87127: done with get_vars() 34589 1727204120.87129: filtering new block on tags 34589 1727204120.87151: done filtering new block on tags 34589 1727204120.87154: in VariableManager get_vars() 34589 1727204120.87186: done with get_vars() 34589 1727204120.87188: filtering new block on tags 34589 1727204120.87208: done filtering new block on tags 34589 1727204120.87211: in VariableManager get_vars() 34589 1727204120.87229: done with get_vars() 34589 1727204120.87231: filtering new block on tags 34589 1727204120.87246: done filtering new block on tags 34589 1727204120.87252: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 34589 1727204120.87258: extending task lists for all hosts with included blocks 34589 1727204120.87654: done extending task lists 34589 1727204120.87655: done processing included files 34589 1727204120.87656: results queue empty 34589 1727204120.87657: checking for any_errors_fatal 34589 1727204120.87658: done checking for any_errors_fatal 34589 1727204120.87659: checking for max_fail_percentage 34589 1727204120.87660: done checking for max_fail_percentage 34589 1727204120.87661: checking to see if all hosts have failed and the running result is not ok 34589 1727204120.87661: done checking to see if all hosts have failed 34589 1727204120.87662: getting the remaining hosts for this loop 34589 1727204120.87663: done getting the remaining hosts for this loop 34589 1727204120.87665: getting the next task for host managed-node1 34589 1727204120.87669: done getting next task for host managed-node1 34589 1727204120.87671: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204120.87673: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204120.87687: getting variables 34589 1727204120.87688: in VariableManager get_vars() 34589 1727204120.87703: Calling all_inventory to load vars for managed-node1 34589 1727204120.87706: Calling groups_inventory to load vars for managed-node1 34589 1727204120.87710: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.87716: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.87718: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.87720: Calling groups_plugins_play to load vars for managed-node1 34589 1727204120.88788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204120.89783: done with get_vars() 34589 1727204120.89799: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.070) 0:00:21.033 ***** 34589 1727204120.89856: entering _queue_task() for managed-node1/setup 34589 1727204120.90194: worker is 1 (out of 1 available) 34589 1727204120.90209: exiting _queue_task() for managed-node1/setup 34589 1727204120.90221: done queuing things up, now waiting for results queue to drain 34589 1727204120.90222: waiting for pending results... 
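For orientation, the records above show the include task "Ensure ansible_facts used by role" (tasks/main.yml:4) passing its two guards and pulling in set_facts.yml, whose first task is now being queued. The role's actual YAML is not reproduced anywhere in this log, so the following is only a minimal sketch of what such a guarded include might look like; the task name, the included file name, and the two conditions are taken from the log records above, while the surrounding file layout is an assumption.

# Illustrative sketch only; not the role's actual tasks/main.yml.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when:
    - ansible_distribution_major_version != '6'
    - not connection_failed

Conditions attached to an include_tasks are inherited by every task the include loads, which is why the log re-evaluates the same two conditionals in front of each task coming out of set_facts.yml rather than only once at the include itself.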
34589 1727204120.90534: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204120.90634: in run() - task 028d2410-947f-a9c6-cddc-000000000495 34589 1727204120.90819: variable 'ansible_search_path' from source: unknown 34589 1727204120.90822: variable 'ansible_search_path' from source: unknown 34589 1727204120.90851: calling self._execute() 34589 1727204120.91059: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204120.91072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204120.91090: variable 'omit' from source: magic vars 34589 1727204120.91462: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.91530: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.91639: variable 'connection_failed' from source: set_fact 34589 1727204120.91649: Evaluated conditional (not connection_failed): True 34589 1727204120.91756: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.91767: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.91861: variable 'connection_failed' from source: set_fact 34589 1727204120.91871: Evaluated conditional (not connection_failed): True 34589 1727204120.92181: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.92184: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.92187: variable 'connection_failed' from source: set_fact 34589 1727204120.92195: Evaluated conditional (not connection_failed): True 34589 1727204120.92197: variable 'ansible_distribution_major_version' from source: facts 34589 1727204120.92430: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204120.92547: variable 'connection_failed' from source: set_fact 34589 1727204120.92560: Evaluated conditional (not connection_failed): True 34589 1727204120.92864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204120.97171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204120.97246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204120.97370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204120.97412: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204120.97463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204120.97553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204120.97590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204120.97625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204120.97673: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204120.97697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204120.97818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204120.97847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204120.97885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204120.97936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204120.97954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204120.98278: variable '__network_required_facts' from source: role '' defaults 34589 1727204120.98290: variable 'ansible_facts' from source: unknown 34589 1727204120.99055: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34589 1727204120.99062: when evaluation is False, skipping this task 34589 1727204120.99068: _execute() done 34589 1727204120.99073: dumping result to json 34589 1727204120.99081: done dumping result, returning 34589 1727204120.99091: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-a9c6-cddc-000000000495] 34589 1727204120.99099: sending task result for task 028d2410-947f-a9c6-cddc-000000000495 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204120.99352: no more pending results, returning what we have 34589 1727204120.99357: results queue empty 34589 1727204120.99358: checking for any_errors_fatal 34589 1727204120.99360: done checking for any_errors_fatal 34589 1727204120.99360: checking for max_fail_percentage 34589 1727204120.99362: done checking for max_fail_percentage 34589 1727204120.99363: checking to see if all hosts have failed and the running result is not ok 34589 1727204120.99364: done checking to see if all hosts have failed 34589 1727204120.99369: getting the remaining hosts for this loop 34589 1727204120.99371: done getting the remaining hosts for this loop 34589 1727204120.99377: getting the next task for host managed-node1 34589 1727204120.99387: done getting next task for host managed-node1 34589 1727204120.99391: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204120.99394: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204120.99412: getting variables 34589 1727204120.99414: in VariableManager get_vars() 34589 1727204120.99454: Calling all_inventory to load vars for managed-node1 34589 1727204120.99457: Calling groups_inventory to load vars for managed-node1 34589 1727204120.99460: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204120.99593: Calling all_plugins_play to load vars for managed-node1 34589 1727204120.99598: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204120.99602: Calling groups_plugins_play to load vars for managed-node1 34589 1727204121.00215: done sending task result for task 028d2410-947f-a9c6-cddc-000000000495 34589 1727204121.00218: WORKER PROCESS EXITING 34589 1727204121.02398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204121.05208: done with get_vars() 34589 1727204121.05240: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.154) 0:00:21.188 ***** 34589 1727204121.05351: entering _queue_task() for managed-node1/stat 34589 1727204121.05841: worker is 1 (out of 1 available) 34589 1727204121.05851: exiting _queue_task() for managed-node1/stat 34589 1727204121.05863: done queuing things up, now waiting for results queue to drain 34589 1727204121.05864: waiting for pending results... 
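The setup task at set_facts.yml:3 was just skipped because its guard, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluated False: the earlier Gathering Facts run already populated every fact the role requires, so the difference is empty. A minimal sketch of such a guarded fact-gathering task is shown below; the task name, the setup action, and the skip condition come from the log, while the gather_subset value is a placeholder assumption (the real module arguments are censored by no_log in this run).

# Illustrative sketch only; module arguments are assumptions.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min    # placeholder subset; real value not shown in this log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

This pattern keeps re-runs cheap: setup only executes on the remote host when at least one required fact is missing from the cached ansible_facts, which is why the log reports "when evaluation is False, skipping this task" here.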
34589 1727204121.06064: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204121.06212: in run() - task 028d2410-947f-a9c6-cddc-000000000497 34589 1727204121.06235: variable 'ansible_search_path' from source: unknown 34589 1727204121.06242: variable 'ansible_search_path' from source: unknown 34589 1727204121.06286: calling self._execute() 34589 1727204121.06397: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204121.06412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204121.06427: variable 'omit' from source: magic vars 34589 1727204121.06838: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.06860: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.06980: variable 'connection_failed' from source: set_fact 34589 1727204121.06992: Evaluated conditional (not connection_failed): True 34589 1727204121.07134: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.07137: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.07230: variable 'connection_failed' from source: set_fact 34589 1727204121.07350: Evaluated conditional (not connection_failed): True 34589 1727204121.07353: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.07355: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.07439: variable 'connection_failed' from source: set_fact 34589 1727204121.07449: Evaluated conditional (not connection_failed): True 34589 1727204121.07573: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.07980: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.07983: variable 'connection_failed' from source: set_fact 34589 1727204121.07987: Evaluated conditional (not connection_failed): True 34589 1727204121.08166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204121.08693: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204121.08809: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204121.08910: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204121.09019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204121.09303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204121.09326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204121.09359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204121.09395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204121.09546: variable 
'__network_is_ostree' from source: set_fact 34589 1727204121.09557: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204121.09564: when evaluation is False, skipping this task 34589 1727204121.09572: _execute() done 34589 1727204121.09582: dumping result to json 34589 1727204121.09590: done dumping result, returning 34589 1727204121.09600: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-a9c6-cddc-000000000497] 34589 1727204121.09614: sending task result for task 028d2410-947f-a9c6-cddc-000000000497 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204121.09793: no more pending results, returning what we have 34589 1727204121.09797: results queue empty 34589 1727204121.09798: checking for any_errors_fatal 34589 1727204121.09810: done checking for any_errors_fatal 34589 1727204121.09811: checking for max_fail_percentage 34589 1727204121.09813: done checking for max_fail_percentage 34589 1727204121.09814: checking to see if all hosts have failed and the running result is not ok 34589 1727204121.09814: done checking to see if all hosts have failed 34589 1727204121.09815: getting the remaining hosts for this loop 34589 1727204121.09816: done getting the remaining hosts for this loop 34589 1727204121.09820: getting the next task for host managed-node1 34589 1727204121.09827: done getting next task for host managed-node1 34589 1727204121.09831: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204121.09835: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204121.09855: getting variables 34589 1727204121.09857: in VariableManager get_vars() 34589 1727204121.09903: Calling all_inventory to load vars for managed-node1 34589 1727204121.09909: Calling groups_inventory to load vars for managed-node1 34589 1727204121.09912: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204121.09924: Calling all_plugins_play to load vars for managed-node1 34589 1727204121.09927: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204121.09930: Calling groups_plugins_play to load vars for managed-node1 34589 1727204121.10656: done sending task result for task 028d2410-947f-a9c6-cddc-000000000497 34589 1727204121.10660: WORKER PROCESS EXITING 34589 1727204121.11686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204121.14244: done with get_vars() 34589 1727204121.14271: done getting variables 34589 1727204121.14345: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.090) 0:00:21.278 ***** 34589 1727204121.14381: entering _queue_task() for managed-node1/set_fact 34589 1727204121.14870: worker is 1 (out of 1 available) 34589 1727204121.14886: exiting _queue_task() for managed-node1/set_fact 34589 1727204121.14898: done queuing things up, now waiting for results queue to drain 34589 1727204121.14899: waiting for pending results... 
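The stat task skipped just above (set_facts.yml:12) and the set_fact task being queued here (set_facts.yml:17) form a common check-then-record pair, both guarded by not __network_is_ostree is defined. A hedged sketch of that pair follows; the task names, file positions, actions, and the guard come from the log, while the checked path and the register variable name are assumptions introduced purely for illustration.

# Illustrative sketch only; path and register name are assumed, not taken from this log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed path; not shown in this excerpt
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

In this run the guard is already False, since __network_is_ostree was set earlier in the play (the log lists it as coming "from source: set_fact"), so the stat task was skipped with false_condition "not __network_is_ostree is defined" and the set_fact task is evaluated against the same guard next.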
34589 1727204121.15126: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204121.15284: in run() - task 028d2410-947f-a9c6-cddc-000000000498 34589 1727204121.15306: variable 'ansible_search_path' from source: unknown 34589 1727204121.15318: variable 'ansible_search_path' from source: unknown 34589 1727204121.15361: calling self._execute() 34589 1727204121.15469: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204121.15487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204121.15503: variable 'omit' from source: magic vars 34589 1727204121.15897: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.15923: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.16037: variable 'connection_failed' from source: set_fact 34589 1727204121.16046: Evaluated conditional (not connection_failed): True 34589 1727204121.16153: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.16162: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.16271: variable 'connection_failed' from source: set_fact 34589 1727204121.16283: Evaluated conditional (not connection_failed): True 34589 1727204121.16415: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.16457: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.16563: variable 'connection_failed' from source: set_fact 34589 1727204121.16687: Evaluated conditional (not connection_failed): True 34589 1727204121.16755: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.16765: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.16869: variable 'connection_failed' from source: set_fact 34589 1727204121.16882: Evaluated conditional (not connection_failed): True 34589 1727204121.17057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204121.17353: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204121.17404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204121.17457: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204121.17498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204121.17652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204121.17690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204121.17723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204121.17760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204121.17979: variable 
'__network_is_ostree' from source: set_fact 34589 1727204121.18000: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204121.18002: when evaluation is False, skipping this task 34589 1727204121.18004: _execute() done 34589 1727204121.18009: dumping result to json 34589 1727204121.18012: done dumping result, returning 34589 1727204121.18014: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-a9c6-cddc-000000000498] 34589 1727204121.18016: sending task result for task 028d2410-947f-a9c6-cddc-000000000498 34589 1727204121.18084: done sending task result for task 028d2410-947f-a9c6-cddc-000000000498 34589 1727204121.18088: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204121.18163: no more pending results, returning what we have 34589 1727204121.18167: results queue empty 34589 1727204121.18168: checking for any_errors_fatal 34589 1727204121.18177: done checking for any_errors_fatal 34589 1727204121.18178: checking for max_fail_percentage 34589 1727204121.18180: done checking for max_fail_percentage 34589 1727204121.18181: checking to see if all hosts have failed and the running result is not ok 34589 1727204121.18182: done checking to see if all hosts have failed 34589 1727204121.18182: getting the remaining hosts for this loop 34589 1727204121.18184: done getting the remaining hosts for this loop 34589 1727204121.18188: getting the next task for host managed-node1 34589 1727204121.18198: done getting next task for host managed-node1 34589 1727204121.18202: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204121.18205: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204121.18342: getting variables 34589 1727204121.18345: in VariableManager get_vars() 34589 1727204121.18391: Calling all_inventory to load vars for managed-node1 34589 1727204121.18395: Calling groups_inventory to load vars for managed-node1 34589 1727204121.18397: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204121.18410: Calling all_plugins_play to load vars for managed-node1 34589 1727204121.18413: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204121.18416: Calling groups_plugins_play to load vars for managed-node1 34589 1727204121.19802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204121.21484: done with get_vars() 34589 1727204121.21519: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.072) 0:00:21.351 ***** 34589 1727204121.21620: entering _queue_task() for managed-node1/service_facts 34589 1727204121.22210: worker is 1 (out of 1 available) 34589 1727204121.22221: exiting _queue_task() for managed-node1/service_facts 34589 1727204121.22232: done queuing things up, now waiting for results queue to drain 34589 1727204121.22233: waiting for pending results... 34589 1727204121.22473: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204121.22481: in run() - task 028d2410-947f-a9c6-cddc-00000000049a 34589 1727204121.22484: variable 'ansible_search_path' from source: unknown 34589 1727204121.22487: variable 'ansible_search_path' from source: unknown 34589 1727204121.22527: calling self._execute() 34589 1727204121.22641: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204121.22652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204121.22680: variable 'omit' from source: magic vars 34589 1727204121.23160: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.23217: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.23360: variable 'connection_failed' from source: set_fact 34589 1727204121.23442: Evaluated conditional (not connection_failed): True 34589 1727204121.23626: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.23688: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.23916: variable 'connection_failed' from source: set_fact 34589 1727204121.23926: Evaluated conditional (not connection_failed): True 34589 1727204121.24241: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.24245: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.24366: variable 'connection_failed' from source: set_fact 34589 1727204121.24418: Evaluated conditional (not connection_failed): True 34589 1727204121.24651: variable 'ansible_distribution_major_version' from source: facts 34589 1727204121.24661: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204121.24880: variable 'connection_failed' from source: set_fact 34589 1727204121.24888: Evaluated conditional (not connection_failed): True 34589 1727204121.24891: variable 'omit' from source: magic vars 34589 1727204121.25110: 
variable 'omit' from source: magic vars 34589 1727204121.25136: variable 'omit' from source: magic vars 34589 1727204121.25324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204121.25433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204121.25437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204121.25446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204121.25463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204121.25531: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204121.25606: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204121.25612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204121.25868: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204121.25871: Set connection var ansible_shell_executable to /bin/sh 34589 1727204121.25881: Set connection var ansible_timeout to 10 34589 1727204121.25888: Set connection var ansible_shell_type to sh 34589 1727204121.25901: Set connection var ansible_connection to ssh 34589 1727204121.26083: Set connection var ansible_pipelining to False 34589 1727204121.26086: variable 'ansible_shell_executable' from source: unknown 34589 1727204121.26087: variable 'ansible_connection' from source: unknown 34589 1727204121.26089: variable 'ansible_module_compression' from source: unknown 34589 1727204121.26091: variable 'ansible_shell_type' from source: unknown 34589 1727204121.26093: variable 'ansible_shell_executable' from source: unknown 34589 1727204121.26095: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204121.26096: variable 'ansible_pipelining' from source: unknown 34589 1727204121.26098: variable 'ansible_timeout' from source: unknown 34589 1727204121.26100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204121.26502: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204121.26528: variable 'omit' from source: magic vars 34589 1727204121.26538: starting attempt loop 34589 1727204121.26544: running the handler 34589 1727204121.26562: _low_level_execute_command(): starting 34589 1727204121.26574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204121.27902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204121.27906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.27992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration <<< 34589 1727204121.28086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204121.28133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204121.28288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204121.30043: stdout chunk (state=3): >>>/root <<< 34589 1727204121.30196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204121.30245: stderr chunk (state=3): >>><<< 34589 1727204121.30254: stdout chunk (state=3): >>><<< 34589 1727204121.30297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204121.30586: _low_level_execute_command(): starting 34589 1727204121.30590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813 `" && echo ansible-tmp-1727204121.3034458-36771-84835120831813="` echo /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813 `" ) && sleep 0' 34589 1727204121.31710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204121.31713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.31716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204121.31718: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204121.31720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.31783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204121.31795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204121.31894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204121.32018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204121.34192: stdout chunk (state=3): >>>ansible-tmp-1727204121.3034458-36771-84835120831813=/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813 <<< 34589 1727204121.34238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204121.34286: stderr chunk (state=3): >>><<< 34589 1727204121.34294: stdout chunk (state=3): >>><<< 34589 1727204121.34685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204121.3034458-36771-84835120831813=/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204121.34689: variable 'ansible_module_compression' from source: unknown 34589 1727204121.34691: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 34589 1727204121.34693: variable 'ansible_facts' from source: unknown 34589 1727204121.34834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py 34589 1727204121.35057: Sending initial data 34589 1727204121.35126: Sent initial data (161 bytes) 34589 1727204121.36392: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204121.36437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204121.36461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204121.36479: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204121.36541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.36580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204121.36603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204121.36625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204121.36761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204121.38503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204121.38650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204121.38723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2l724nf6 /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py <<< 34589 1727204121.38799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py" <<< 34589 1727204121.38836: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2l724nf6" to remote "/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py" <<< 34589 1727204121.39894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204121.39972: stderr chunk (state=3): >>><<< 34589 1727204121.39990: stdout chunk (state=3): >>><<< 34589 1727204121.40017: done transferring module to remote 34589 1727204121.40035: _low_level_execute_command(): starting 34589 1727204121.40051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/ /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py && sleep 0' 34589 1727204121.40671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204121.40674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204121.40680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.40682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204121.40684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204121.40745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204121.40747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204121.40826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204121.42772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204121.42809: stderr chunk (state=3): >>><<< 34589 1727204121.42811: stdout chunk (state=3): >>><<< 34589 1727204121.42881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204121.42886: _low_level_execute_command(): starting 34589 1727204121.42888: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/AnsiballZ_service_facts.py && sleep 0' 34589 1727204121.43449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204121.43464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204121.43484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204121.43638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.20619: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": 
{"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": 
{"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": 
"systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34589 1727204123.22367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204123.22377: stdout chunk (state=3): >>><<< 34589 1727204123.22396: stderr chunk (state=3): >>><<< 34589 1727204123.22414: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", 
"status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.14.47 closed. 34589 1727204123.23352: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204123.23355: _low_level_execute_command(): starting 34589 1727204123.23358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204121.3034458-36771-84835120831813/ > /dev/null 2>&1 && sleep 0' 34589 1727204123.24126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204123.24195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204123.24211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204123.24232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204123.24248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204123.24391: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204123.24502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.24611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.26641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204123.26652: stdout chunk (state=3): >>><<< 34589 1727204123.26709: stderr chunk (state=3): >>><<< 34589 1727204123.27083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
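For reference, a minimal playbook sketch (not part of this run) of how the ansible_facts.services structure returned by the service_facts result above is typically gathered and read; the NetworkManager.service key used below is only an illustrative assumption, not taken from this log:

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Read one entry from the facts gathered above (illustrative unit name)
      ansible.builtin.debug:
        msg: >-
          NetworkManager.service is
          {{ ansible_facts.services['NetworkManager.service'].state }}
          ({{ ansible_facts.services['NetworkManager.service'].status }})
      when: "'NetworkManager.service' in ansible_facts.services"

Each entry in ansible_facts.services carries the same name/state/status/source fields shown in the JSON result above.
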
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204123.27086: handler run complete 34589 1727204123.27306: variable 'ansible_facts' from source: unknown 34589 1727204123.27634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204123.28882: variable 'ansible_facts' from source: unknown 34589 1727204123.29159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204123.29694: attempt loop complete, returning result 34589 1727204123.29717: _execute() done 34589 1727204123.29729: dumping result to json 34589 1727204123.29800: done dumping result, returning 34589 1727204123.29817: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-a9c6-cddc-00000000049a] 34589 1727204123.29826: sending task result for task 028d2410-947f-a9c6-cddc-00000000049a ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204123.31606: no more pending results, returning what we have 34589 1727204123.31611: results queue empty 34589 1727204123.31612: checking for any_errors_fatal 34589 1727204123.31616: done checking for any_errors_fatal 34589 1727204123.31617: checking for max_fail_percentage 34589 1727204123.31618: done checking for max_fail_percentage 34589 1727204123.31619: checking to see if all hosts have failed and the running result is not ok 34589 1727204123.31620: done checking to see if all hosts have failed 34589 1727204123.31621: getting the remaining hosts for this loop 34589 1727204123.31622: done getting the remaining hosts for this loop 34589 1727204123.31625: getting the next task for host managed-node1 34589 1727204123.31631: done getting next task for host managed-node1 34589 1727204123.31634: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204123.31637: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204123.31647: getting variables 34589 1727204123.31648: in VariableManager get_vars() 34589 1727204123.31797: Calling all_inventory to load vars for managed-node1 34589 1727204123.31801: Calling groups_inventory to load vars for managed-node1 34589 1727204123.31804: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204123.31812: done sending task result for task 028d2410-947f-a9c6-cddc-00000000049a 34589 1727204123.31815: WORKER PROCESS EXITING 34589 1727204123.31824: Calling all_plugins_play to load vars for managed-node1 34589 1727204123.31827: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204123.31829: Calling groups_plugins_play to load vars for managed-node1 34589 1727204123.35065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204123.37351: done with get_vars() 34589 1727204123.37381: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:23 -0400 (0:00:02.158) 0:00:23.509 ***** 34589 1727204123.37490: entering _queue_task() for managed-node1/package_facts 34589 1727204123.38140: worker is 1 (out of 1 available) 34589 1727204123.38149: exiting _queue_task() for managed-node1/package_facts 34589 1727204123.38163: done queuing things up, now waiting for results queue to drain 34589 1727204123.38164: waiting for pending results... 34589 1727204123.38464: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204123.39085: in run() - task 028d2410-947f-a9c6-cddc-00000000049b 34589 1727204123.39091: variable 'ansible_search_path' from source: unknown 34589 1727204123.39093: variable 'ansible_search_path' from source: unknown 34589 1727204123.39098: calling self._execute() 34589 1727204123.39102: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204123.39106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204123.39282: variable 'omit' from source: magic vars 34589 1727204123.39861: variable 'ansible_distribution_major_version' from source: facts 34589 1727204123.39888: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204123.40008: variable 'connection_failed' from source: set_fact 34589 1727204123.40020: Evaluated conditional (not connection_failed): True 34589 1727204123.40125: variable 'ansible_distribution_major_version' from source: facts 34589 1727204123.40147: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204123.40244: variable 'connection_failed' from source: set_fact 34589 1727204123.40256: Evaluated conditional (not connection_failed): True 34589 1727204123.40373: variable 'ansible_distribution_major_version' from source: facts 34589 1727204123.40386: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204123.40484: variable 'connection_failed' from source: set_fact 34589 1727204123.40495: Evaluated conditional (not connection_failed): True 34589 1727204123.40601: variable 'ansible_distribution_major_version' from source: facts 34589 1727204123.40611: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204123.40705: variable 'connection_failed' from source: set_fact 34589 1727204123.40716: 
Evaluated conditional (not connection_failed): True 34589 1727204123.40727: variable 'omit' from source: magic vars 34589 1727204123.40803: variable 'omit' from source: magic vars 34589 1727204123.40842: variable 'omit' from source: magic vars 34589 1727204123.40905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204123.40960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204123.40996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204123.41012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204123.41021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204123.41044: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204123.41047: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204123.41050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204123.41135: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204123.41139: Set connection var ansible_shell_executable to /bin/sh 34589 1727204123.41146: Set connection var ansible_timeout to 10 34589 1727204123.41149: Set connection var ansible_shell_type to sh 34589 1727204123.41155: Set connection var ansible_connection to ssh 34589 1727204123.41160: Set connection var ansible_pipelining to False 34589 1727204123.41178: variable 'ansible_shell_executable' from source: unknown 34589 1727204123.41181: variable 'ansible_connection' from source: unknown 34589 1727204123.41183: variable 'ansible_module_compression' from source: unknown 34589 1727204123.41186: variable 'ansible_shell_type' from source: unknown 34589 1727204123.41188: variable 'ansible_shell_executable' from source: unknown 34589 1727204123.41191: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204123.41200: variable 'ansible_pipelining' from source: unknown 34589 1727204123.41202: variable 'ansible_timeout' from source: unknown 34589 1727204123.41205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204123.41341: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204123.41349: variable 'omit' from source: magic vars 34589 1727204123.41354: starting attempt loop 34589 1727204123.41357: running the handler 34589 1727204123.41369: _low_level_execute_command(): starting 34589 1727204123.41377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204123.41866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204123.41872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204123.41876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.41932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204123.41935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204123.41941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.42022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.43799: stdout chunk (state=3): >>>/root <<< 34589 1727204123.43883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204123.43920: stderr chunk (state=3): >>><<< 34589 1727204123.43924: stdout chunk (state=3): >>><<< 34589 1727204123.43958: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204123.43972: _low_level_execute_command(): starting 34589 1727204123.43987: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150 `" && echo ansible-tmp-1727204123.4395804-36949-247469181790150="` echo /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150 `" ) && sleep 0' 34589 1727204123.44445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204123.44458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.44543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.46641: stdout chunk (state=3): >>>ansible-tmp-1727204123.4395804-36949-247469181790150=/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150 <<< 34589 1727204123.46751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204123.46774: stderr chunk (state=3): >>><<< 34589 1727204123.46780: stdout chunk (state=3): >>><<< 34589 1727204123.46792: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.4395804-36949-247469181790150=/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204123.46832: variable 'ansible_module_compression' from source: unknown 34589 1727204123.46871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 34589 1727204123.46922: variable 'ansible_facts' from source: unknown 34589 1727204123.47044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py 34589 1727204123.47148: Sending initial data 34589 1727204123.47151: Sent initial data (162 bytes) 34589 1727204123.47584: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204123.47587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204123.47589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
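For reference, a minimal playbook sketch (not part of this run) of the package_facts task being transferred and executed here, plus one way its ansible_facts.packages output can be consumed; the glibc key below simply mirrors one of the packages reported later in this log:

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Read one entry from the facts gathered above
      ansible.builtin.debug:
        msg: "glibc {{ ansible_facts.packages['glibc'][0].version }} is installed"
      when: "'glibc' in ansible_facts.packages"

ansible_facts.packages is keyed by package name, and each value is a list of dicts with the name/version/release/epoch/arch/source fields visible in the module output that follows.
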
34589 1727204123.47592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204123.47595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.47645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204123.47648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.47732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.49478: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34589 1727204123.49482: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204123.49550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204123.49628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpf7qwhluh /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py <<< 34589 1727204123.49631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py" <<< 34589 1727204123.49703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpf7qwhluh" to remote "/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py" <<< 34589 1727204123.49708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py" <<< 34589 1727204123.50948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204123.50961: stderr chunk (state=3): >>><<< 34589 1727204123.50964: stdout chunk (state=3): >>><<< 34589 1727204123.51005: done transferring module to remote 34589 1727204123.51016: _low_level_execute_command(): starting 34589 1727204123.51019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/ /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py && sleep 0' 34589 1727204123.51445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204123.51482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204123.51485: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204123.51487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.51489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204123.51491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204123.51495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.51539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204123.51543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.51629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204123.53588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204123.53613: stderr chunk (state=3): >>><<< 34589 1727204123.53616: stdout chunk (state=3): >>><<< 34589 1727204123.53633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204123.53638: _low_level_execute_command(): starting 34589 1727204123.53641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/AnsiballZ_package_facts.py && sleep 0' 34589 1727204123.54085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204123.54088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.54091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204123.54093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204123.54095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204123.54145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204123.54151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204123.54154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204123.54235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204124.01186: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": 
"17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 34589 1727204124.01204: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": 
"4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 34589 1727204124.01253: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 34589 1727204124.01268: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 34589 1727204124.01286: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": 
[{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 34589 1727204124.01292: stdout chunk 
(state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": 
"6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el1<<< 34589 1727204124.01341: stdout chunk (state=3): >>>0", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 34589 1727204124.01358: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 34589 1727204124.01365: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": 
"perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", 
"version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 34589 1727204124.01371: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 34589 1727204124.01398: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", 
"version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34589 1727204124.03451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204124.03487: stderr chunk (state=3): >>><<< 34589 1727204124.03490: stdout chunk (state=3): >>><<< 34589 1727204124.03534: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204124.05862: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204124.05866: _low_level_execute_command(): starting 34589 1727204124.05869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.4395804-36949-247469181790150/ > /dev/null 2>&1 && sleep 0' 34589 1727204124.06456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204124.06491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204124.06584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204124.06609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204124.06730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204124.08725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204124.08750: stderr chunk (state=3): >>><<< 34589 1727204124.08754: stdout chunk (state=3): >>><<< 34589 1727204124.08768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204124.08783: handler run complete 34589 1727204124.09340: variable 'ansible_facts' from source: unknown 34589 1727204124.09634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.10997: variable 'ansible_facts' from source: unknown 34589 1727204124.11237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.11618: attempt loop complete, returning result 34589 1727204124.11627: _execute() done 34589 1727204124.11630: dumping result to json 34589 1727204124.11745: done dumping result, returning 34589 1727204124.11753: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-a9c6-cddc-00000000049b] 34589 1727204124.11756: sending task result for task 028d2410-947f-a9c6-cddc-00000000049b ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204124.13082: done sending task result for task 028d2410-947f-a9c6-cddc-00000000049b 34589 1727204124.13088: WORKER PROCESS EXITING 34589 1727204124.13093: no more pending results, returning what we have 34589 1727204124.13095: results queue empty 34589 1727204124.13095: checking for any_errors_fatal 34589 1727204124.13099: done checking for any_errors_fatal 34589 1727204124.13100: checking for max_fail_percentage 34589 1727204124.13101: done checking for max_fail_percentage 34589 1727204124.13102: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.13102: done checking to see if all hosts have failed 34589 1727204124.13102: getting the remaining hosts for this loop 34589 1727204124.13103: done getting the remaining hosts for this loop 34589 1727204124.13106: getting the next task for host managed-node1 34589 1727204124.13112: done getting next task for host managed-node1 34589 1727204124.13115: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204124.13116: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204124.13122: getting variables 34589 1727204124.13123: in VariableManager get_vars() 34589 1727204124.13146: Calling all_inventory to load vars for managed-node1 34589 1727204124.13148: Calling groups_inventory to load vars for managed-node1 34589 1727204124.13150: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.13156: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.13158: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.13160: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.13885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.14784: done with get_vars() 34589 1727204124.14800: done getting variables 34589 1727204124.14846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.773) 0:00:24.283 ***** 34589 1727204124.14868: entering _queue_task() for managed-node1/debug 34589 1727204124.15136: worker is 1 (out of 1 available) 34589 1727204124.15160: exiting _queue_task() for managed-node1/debug 34589 1727204124.15173: done queuing things up, now waiting for results queue to drain 34589 1727204124.15174: waiting for pending results... 34589 1727204124.15347: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204124.15412: in run() - task 028d2410-947f-a9c6-cddc-000000000068 34589 1727204124.15425: variable 'ansible_search_path' from source: unknown 34589 1727204124.15429: variable 'ansible_search_path' from source: unknown 34589 1727204124.15457: calling self._execute() 34589 1727204124.15545: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.15550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.15558: variable 'omit' from source: magic vars 34589 1727204124.15836: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.15845: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.16081: variable 'connection_failed' from source: set_fact 34589 1727204124.16085: Evaluated conditional (not connection_failed): True 34589 1727204124.16088: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.16091: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.16193: variable 'connection_failed' from source: set_fact 34589 1727204124.16209: Evaluated conditional (not connection_failed): True 34589 1727204124.16223: variable 'omit' from source: magic vars 34589 1727204124.16267: variable 'omit' from source: magic vars 34589 1727204124.16369: variable 'network_provider' from source: set_fact 34589 1727204124.16394: variable 'omit' from source: magic vars 34589 1727204124.16441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204124.16484: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204124.16510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204124.16532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204124.16550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204124.16585: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204124.16594: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.16602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.16702: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204124.16779: Set connection var ansible_shell_executable to /bin/sh 34589 1727204124.16782: Set connection var ansible_timeout to 10 34589 1727204124.16785: Set connection var ansible_shell_type to sh 34589 1727204124.16786: Set connection var ansible_connection to ssh 34589 1727204124.16788: Set connection var ansible_pipelining to False 34589 1727204124.16790: variable 'ansible_shell_executable' from source: unknown 34589 1727204124.16792: variable 'ansible_connection' from source: unknown 34589 1727204124.16794: variable 'ansible_module_compression' from source: unknown 34589 1727204124.16796: variable 'ansible_shell_type' from source: unknown 34589 1727204124.16797: variable 'ansible_shell_executable' from source: unknown 34589 1727204124.16799: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.16801: variable 'ansible_pipelining' from source: unknown 34589 1727204124.16802: variable 'ansible_timeout' from source: unknown 34589 1727204124.16804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.16950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204124.16965: variable 'omit' from source: magic vars 34589 1727204124.16973: starting attempt loop 34589 1727204124.16981: running the handler 34589 1727204124.17034: handler run complete 34589 1727204124.17066: attempt loop complete, returning result 34589 1727204124.17073: _execute() done 34589 1727204124.17084: dumping result to json 34589 1727204124.17181: done dumping result, returning 34589 1727204124.17184: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-a9c6-cddc-000000000068] 34589 1727204124.17186: sending task result for task 028d2410-947f-a9c6-cddc-000000000068 34589 1727204124.17254: done sending task result for task 028d2410-947f-a9c6-cddc-000000000068 34589 1727204124.17258: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 34589 1727204124.17338: no more pending results, returning what we have 34589 1727204124.17341: results queue empty 34589 1727204124.17342: checking for any_errors_fatal 34589 1727204124.17352: done checking for any_errors_fatal 34589 1727204124.17353: checking for max_fail_percentage 34589 1727204124.17354: done checking for max_fail_percentage 34589 
1727204124.17355: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.17356: done checking to see if all hosts have failed 34589 1727204124.17356: getting the remaining hosts for this loop 34589 1727204124.17358: done getting the remaining hosts for this loop 34589 1727204124.17361: getting the next task for host managed-node1 34589 1727204124.17367: done getting next task for host managed-node1 34589 1727204124.17371: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204124.17372: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204124.17382: getting variables 34589 1727204124.17385: in VariableManager get_vars() 34589 1727204124.17423: Calling all_inventory to load vars for managed-node1 34589 1727204124.17426: Calling groups_inventory to load vars for managed-node1 34589 1727204124.17429: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.17439: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.17442: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.17444: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.18910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.20729: done with get_vars() 34589 1727204124.20753: done getting variables 34589 1727204124.20822: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.059) 0:00:24.343 ***** 34589 1727204124.20856: entering _queue_task() for managed-node1/fail 34589 1727204124.21487: worker is 1 (out of 1 available) 34589 1727204124.21500: exiting _queue_task() for managed-node1/fail 34589 1727204124.21514: done queuing things up, now waiting for results queue to drain 34589 1727204124.21516: waiting for pending results... 
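For reference, the "Print network provider" task whose result is shown above ("Using network provider: nm") is a plain debug task over the network_provider fact set earlier in the run. A minimal sketch, assuming the message format seen in the output (the exact task text in the role's tasks/main.yml may differ):

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"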
34589 1727204124.21761: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204124.21783: in run() - task 028d2410-947f-a9c6-cddc-000000000069 34589 1727204124.21806: variable 'ansible_search_path' from source: unknown 34589 1727204124.21824: variable 'ansible_search_path' from source: unknown 34589 1727204124.21879: calling self._execute() 34589 1727204124.22030: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.22074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.22079: variable 'omit' from source: magic vars 34589 1727204124.22626: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.22642: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.22756: variable 'connection_failed' from source: set_fact 34589 1727204124.22778: Evaluated conditional (not connection_failed): True 34589 1727204124.22873: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.22981: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.22994: variable 'connection_failed' from source: set_fact 34589 1727204124.23003: Evaluated conditional (not connection_failed): True 34589 1727204124.23131: variable 'network_state' from source: role '' defaults 34589 1727204124.23146: Evaluated conditional (network_state != {}): False 34589 1727204124.23153: when evaluation is False, skipping this task 34589 1727204124.23160: _execute() done 34589 1727204124.23166: dumping result to json 34589 1727204124.23173: done dumping result, returning 34589 1727204124.23186: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-a9c6-cddc-000000000069] 34589 1727204124.23195: sending task result for task 028d2410-947f-a9c6-cddc-000000000069 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204124.23361: no more pending results, returning what we have 34589 1727204124.23365: results queue empty 34589 1727204124.23367: checking for any_errors_fatal 34589 1727204124.23378: done checking for any_errors_fatal 34589 1727204124.23379: checking for max_fail_percentage 34589 1727204124.23381: done checking for max_fail_percentage 34589 1727204124.23382: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.23383: done checking to see if all hosts have failed 34589 1727204124.23383: getting the remaining hosts for this loop 34589 1727204124.23385: done getting the remaining hosts for this loop 34589 1727204124.23389: getting the next task for host managed-node1 34589 1727204124.23396: done getting next task for host managed-node1 34589 1727204124.23400: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204124.23403: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 34589 1727204124.23421: getting variables 34589 1727204124.23424: in VariableManager get_vars() 34589 1727204124.23466: Calling all_inventory to load vars for managed-node1 34589 1727204124.23470: Calling groups_inventory to load vars for managed-node1 34589 1727204124.23473: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.23790: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.23795: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.23798: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.29553: done sending task result for task 028d2410-947f-a9c6-cddc-000000000069 34589 1727204124.29558: WORKER PROCESS EXITING 34589 1727204124.30536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.32129: done with get_vars() 34589 1727204124.32147: done getting variables 34589 1727204124.32183: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.113) 0:00:24.456 ***** 34589 1727204124.32204: entering _queue_task() for managed-node1/fail 34589 1727204124.32473: worker is 1 (out of 1 available) 34589 1727204124.32488: exiting _queue_task() for managed-node1/fail 34589 1727204124.32500: done queuing things up, now waiting for results queue to drain 34589 1727204124.32501: waiting for pending results... 
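The "Abort applying the network state configuration..." task above is a fail action that was skipped because its guard network_state != {} evaluated False (network_state comes from the role defaults and is empty here). A hedged sketch of such a guarded fail task; the failure message is illustrative, and only the first failing condition is reported in the log, so the real task may carry additional when clauses:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # illustrative message, not quoted from the role
  when: network_state != {}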
34589 1727204124.32690: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204124.32773: in run() - task 028d2410-947f-a9c6-cddc-00000000006a 34589 1727204124.32787: variable 'ansible_search_path' from source: unknown 34589 1727204124.32790: variable 'ansible_search_path' from source: unknown 34589 1727204124.32819: calling self._execute() 34589 1727204124.32896: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.32901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.32912: variable 'omit' from source: magic vars 34589 1727204124.33189: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.33199: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.33278: variable 'connection_failed' from source: set_fact 34589 1727204124.33281: Evaluated conditional (not connection_failed): True 34589 1727204124.33352: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.33356: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.33423: variable 'connection_failed' from source: set_fact 34589 1727204124.33426: Evaluated conditional (not connection_failed): True 34589 1727204124.33510: variable 'network_state' from source: role '' defaults 34589 1727204124.33514: Evaluated conditional (network_state != {}): False 34589 1727204124.33517: when evaluation is False, skipping this task 34589 1727204124.33519: _execute() done 34589 1727204124.33522: dumping result to json 34589 1727204124.33524: done dumping result, returning 34589 1727204124.33531: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-a9c6-cddc-00000000006a] 34589 1727204124.33535: sending task result for task 028d2410-947f-a9c6-cddc-00000000006a 34589 1727204124.33634: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006a 34589 1727204124.33636: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204124.33685: no more pending results, returning what we have 34589 1727204124.33688: results queue empty 34589 1727204124.33689: checking for any_errors_fatal 34589 1727204124.33703: done checking for any_errors_fatal 34589 1727204124.33703: checking for max_fail_percentage 34589 1727204124.33705: done checking for max_fail_percentage 34589 1727204124.33708: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.33709: done checking to see if all hosts have failed 34589 1727204124.33709: getting the remaining hosts for this loop 34589 1727204124.33711: done getting the remaining hosts for this loop 34589 1727204124.33714: getting the next task for host managed-node1 34589 1727204124.33720: done getting next task for host managed-node1 34589 1727204124.33724: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204124.33726: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204124.33740: getting variables 34589 1727204124.33742: in VariableManager get_vars() 34589 1727204124.33778: Calling all_inventory to load vars for managed-node1 34589 1727204124.33781: Calling groups_inventory to load vars for managed-node1 34589 1727204124.33784: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.33818: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.33822: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.33826: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.35219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.36117: done with get_vars() 34589 1727204124.36132: done getting variables 34589 1727204124.36173: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.039) 0:00:24.496 ***** 34589 1727204124.36198: entering _queue_task() for managed-node1/fail 34589 1727204124.36442: worker is 1 (out of 1 available) 34589 1727204124.36456: exiting _queue_task() for managed-node1/fail 34589 1727204124.36470: done queuing things up, now waiting for results queue to drain 34589 1727204124.36471: waiting for pending results... 
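Note that every task in this run is first gated on ansible_distribution_major_version != '6' and not connection_failed (each evaluated twice) before its own condition is checked. These look like inherited when: conditions from blocks in the calling play rather than from the role itself; a hypothetical sketch of that wrapping, where only the two conditions are taken from the log and the nesting and task names are assumed:

- block:
    - block:
        - name: Run the network role
          include_role:
            name: fedora.linux_system_roles.network
      when: not connection_failed
  when: ansible_distribution_major_version != '6'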
34589 1727204124.36654: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204124.36735: in run() - task 028d2410-947f-a9c6-cddc-00000000006b 34589 1727204124.36747: variable 'ansible_search_path' from source: unknown 34589 1727204124.36750: variable 'ansible_search_path' from source: unknown 34589 1727204124.36795: calling self._execute() 34589 1727204124.36981: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.36985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.36988: variable 'omit' from source: magic vars 34589 1727204124.37288: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.37305: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.37422: variable 'connection_failed' from source: set_fact 34589 1727204124.37435: Evaluated conditional (not connection_failed): True 34589 1727204124.37548: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.37559: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.37662: variable 'connection_failed' from source: set_fact 34589 1727204124.37674: Evaluated conditional (not connection_failed): True 34589 1727204124.37848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.40044: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.40125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.40166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.40381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.40384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.40387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.40390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.40393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.40425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.40444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.40546: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.40566: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34589 1727204124.40700: variable 
'ansible_distribution' from source: facts 34589 1727204124.40713: variable '__network_rh_distros' from source: role '' defaults 34589 1727204124.40725: Evaluated conditional (ansible_distribution in __network_rh_distros): True 34589 1727204124.40983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.41015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.41045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.41090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.41112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.41162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.41191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.41221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.41262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.41282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.41329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.41358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.41393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.41439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.41459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.41766: variable 'network_connections' from source: play vars 34589 1727204124.41787: variable 'profile' from source: play vars 34589 1727204124.41861: variable 'profile' from source: play vars 34589 1727204124.41872: variable 'interface' from source: set_fact 34589 1727204124.41941: variable 'interface' from source: set_fact 34589 1727204124.41957: variable 'network_state' from source: role '' defaults 34589 1727204124.42033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204124.42221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204124.42380: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204124.42392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204124.42395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204124.42398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204124.42412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204124.42443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.42474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204124.42510: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 34589 1727204124.42519: when evaluation is False, skipping this task 34589 1727204124.42527: _execute() done 34589 1727204124.42534: dumping result to json 34589 1727204124.42541: done dumping result, returning 34589 1727204124.42554: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-a9c6-cddc-00000000006b] 34589 1727204124.42563: sending task result for task 028d2410-947f-a9c6-cddc-00000000006b skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 34589 1727204124.42713: no more pending results, returning what we have 34589 1727204124.42716: results queue empty 34589 1727204124.42716: checking for any_errors_fatal 34589 1727204124.42722: done checking for any_errors_fatal 34589 1727204124.42723: checking for max_fail_percentage 34589 
1727204124.42724: done checking for max_fail_percentage 34589 1727204124.42725: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.42726: done checking to see if all hosts have failed 34589 1727204124.42727: getting the remaining hosts for this loop 34589 1727204124.42728: done getting the remaining hosts for this loop 34589 1727204124.42731: getting the next task for host managed-node1 34589 1727204124.42737: done getting next task for host managed-node1 34589 1727204124.42740: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204124.42742: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204124.42754: getting variables 34589 1727204124.42757: in VariableManager get_vars() 34589 1727204124.42794: Calling all_inventory to load vars for managed-node1 34589 1727204124.42797: Calling groups_inventory to load vars for managed-node1 34589 1727204124.42799: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.42808: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.42810: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.42813: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.43391: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006b 34589 1727204124.43394: WORKER PROCESS EXITING 34589 1727204124.44325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.45959: done with get_vars() 34589 1727204124.45986: done getting variables 34589 1727204124.46049: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.098) 0:00:24.595 ***** 34589 1727204124.46081: entering _queue_task() for managed-node1/dnf 34589 1727204124.46434: worker is 1 (out of 1 available) 34589 1727204124.46447: exiting _queue_task() for managed-node1/dnf 34589 1727204124.46458: done queuing things up, now waiting for results queue to drain 34589 1727204124.46460: waiting for pending results... 
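The teaming abort task skipped above combines three conditions that all appear in the log: the EL10-or-later version check, the Red Hat distribution check, and the selectattr chain quoted in the skip result, which scans both network_connections and network_state["interfaces"] for any entry whose type is "team". A sketch with the conditions copied from the output (the fail message is illustrative):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later  # illustrative message
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0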
34589 1727204124.46763: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204124.46895: in run() - task 028d2410-947f-a9c6-cddc-00000000006c 34589 1727204124.46924: variable 'ansible_search_path' from source: unknown 34589 1727204124.46933: variable 'ansible_search_path' from source: unknown 34589 1727204124.46974: calling self._execute() 34589 1727204124.47090: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.47103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.47124: variable 'omit' from source: magic vars 34589 1727204124.47529: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.47545: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.47665: variable 'connection_failed' from source: set_fact 34589 1727204124.47677: Evaluated conditional (not connection_failed): True 34589 1727204124.47787: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.47798: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.47899: variable 'connection_failed' from source: set_fact 34589 1727204124.47913: Evaluated conditional (not connection_failed): True 34589 1727204124.48109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.50755: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.50831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.50886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.50933: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.50964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.51055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.51091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.51281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.51285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.51287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.51314: variable 'ansible_distribution' from source: facts 34589 1727204124.51325: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.51346: Evaluated 
conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34589 1727204124.51464: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204124.51604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.51640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.51670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.51719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.51744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.51792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.51825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.51859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.51905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.51926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.51972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.52001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.52031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.52080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.52100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 34589 1727204124.52259: variable 'network_connections' from source: play vars 34589 1727204124.52281: variable 'profile' from source: play vars 34589 1727204124.52356: variable 'profile' from source: play vars 34589 1727204124.52380: variable 'interface' from source: set_fact 34589 1727204124.52436: variable 'interface' from source: set_fact 34589 1727204124.52597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204124.52672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204124.52726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204124.52758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204124.52817: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204124.52863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204124.52896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204124.52933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.52962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204124.53016: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204124.53261: variable 'network_connections' from source: play vars 34589 1727204124.53271: variable 'profile' from source: play vars 34589 1727204124.53332: variable 'profile' from source: play vars 34589 1727204124.53341: variable 'interface' from source: set_fact 34589 1727204124.53434: variable 'interface' from source: set_fact 34589 1727204124.53444: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204124.53452: when evaluation is False, skipping this task 34589 1727204124.53459: _execute() done 34589 1727204124.53680: dumping result to json 34589 1727204124.53684: done dumping result, returning 34589 1727204124.53686: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000006c] 34589 1727204124.53688: sending task result for task 028d2410-947f-a9c6-cddc-00000000006c 34589 1727204124.53765: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006c 34589 1727204124.53767: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204124.53824: no more pending results, returning what we have 34589 1727204124.53827: results queue empty 34589 1727204124.53827: checking for any_errors_fatal 34589 
1727204124.53834: done checking for any_errors_fatal 34589 1727204124.53834: checking for max_fail_percentage 34589 1727204124.53836: done checking for max_fail_percentage 34589 1727204124.53837: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.53838: done checking to see if all hosts have failed 34589 1727204124.53839: getting the remaining hosts for this loop 34589 1727204124.53840: done getting the remaining hosts for this loop 34589 1727204124.53844: getting the next task for host managed-node1 34589 1727204124.53850: done getting next task for host managed-node1 34589 1727204124.53854: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204124.53856: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204124.53869: getting variables 34589 1727204124.53871: in VariableManager get_vars() 34589 1727204124.53913: Calling all_inventory to load vars for managed-node1 34589 1727204124.53917: Calling groups_inventory to load vars for managed-node1 34589 1727204124.53919: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.53930: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.53932: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.53935: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.55631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.56964: done with get_vars() 34589 1727204124.56983: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204124.57051: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.109) 0:00:24.705 ***** 34589 1727204124.57071: entering _queue_task() for managed-node1/yum 34589 1727204124.57331: worker is 1 (out of 1 available) 34589 1727204124.57345: exiting _queue_task() for managed-node1/yum 34589 1727204124.57358: done queuing things up, now waiting for results queue to drain 34589 1727204124.57360: waiting for pending results... 
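The DNF package-check task above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined was true for the single profile in network_connections. A hedged sketch of a task gated this way; the conditions are taken from the log, but the module arguments and check_mode are assumptions, since the log does not show them:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"   # assumed argument list
    state: latest
  check_mode: true                   # assumed, since the task only checks for available updates
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined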
34589 1727204124.57639: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204124.57804: in run() - task 028d2410-947f-a9c6-cddc-00000000006d 34589 1727204124.57811: variable 'ansible_search_path' from source: unknown 34589 1727204124.57814: variable 'ansible_search_path' from source: unknown 34589 1727204124.57916: calling self._execute() 34589 1727204124.57967: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.57984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.57998: variable 'omit' from source: magic vars 34589 1727204124.58928: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.59010: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.59326: variable 'connection_failed' from source: set_fact 34589 1727204124.59330: Evaluated conditional (not connection_failed): True 34589 1727204124.59646: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.59649: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.59702: variable 'connection_failed' from source: set_fact 34589 1727204124.59752: Evaluated conditional (not connection_failed): True 34589 1727204124.59881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.62648: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.62681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.62731: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.62783: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.62823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.62921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.62957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.63009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.63310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.63313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.63416: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.63438: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34589 
1727204124.63446: when evaluation is False, skipping this task 34589 1727204124.63453: _execute() done 34589 1727204124.63460: dumping result to json 34589 1727204124.63466: done dumping result, returning 34589 1727204124.63503: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000006d] 34589 1727204124.63508: sending task result for task 028d2410-947f-a9c6-cddc-00000000006d skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34589 1727204124.63782: no more pending results, returning what we have 34589 1727204124.63786: results queue empty 34589 1727204124.63786: checking for any_errors_fatal 34589 1727204124.63792: done checking for any_errors_fatal 34589 1727204124.63793: checking for max_fail_percentage 34589 1727204124.63795: done checking for max_fail_percentage 34589 1727204124.63796: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.63796: done checking to see if all hosts have failed 34589 1727204124.63797: getting the remaining hosts for this loop 34589 1727204124.63798: done getting the remaining hosts for this loop 34589 1727204124.63802: getting the next task for host managed-node1 34589 1727204124.63811: done getting next task for host managed-node1 34589 1727204124.63815: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204124.63817: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204124.63831: getting variables 34589 1727204124.63832: in VariableManager get_vars() 34589 1727204124.63872: Calling all_inventory to load vars for managed-node1 34589 1727204124.63877: Calling groups_inventory to load vars for managed-node1 34589 1727204124.63880: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.63890: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.63892: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.63895: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.64615: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006d 34589 1727204124.64619: WORKER PROCESS EXITING 34589 1727204124.65641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.66528: done with get_vars() 34589 1727204124.66545: done getting variables 34589 1727204124.66594: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.095) 0:00:24.801 ***** 34589 1727204124.66619: entering _queue_task() for managed-node1/fail 34589 1727204124.66868: worker is 1 (out of 1 available) 34589 1727204124.66883: exiting _queue_task() for managed-node1/fail 34589 1727204124.66897: done queuing things up, now waiting for results queue to drain 34589 1727204124.66898: waiting for pending results... 
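The __network_wireless_connections_defined and __network_team_connections_defined flags used by these guard tasks come from the role defaults and are re-evaluated against network_connections each time. A hypothetical sketch of how such defaults could be expressed, modeled on the teaming expression quoted earlier in the log (the role's actual defaults file is not shown here):

# defaults/main.yml (hypothetical sketch, not quoted from the role)
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr("type", "defined")
     | selectattr("type", "match", "^wireless$") | list | length > 0 }}
__network_team_connections_defined: >-
  {{ network_connections | selectattr("type", "defined")
     | selectattr("type", "match", "^team$") | list | length > 0 }}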
34589 1727204124.67073: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204124.67152: in run() - task 028d2410-947f-a9c6-cddc-00000000006e 34589 1727204124.67164: variable 'ansible_search_path' from source: unknown 34589 1727204124.67168: variable 'ansible_search_path' from source: unknown 34589 1727204124.67203: calling self._execute() 34589 1727204124.67286: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.67290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.67298: variable 'omit' from source: magic vars 34589 1727204124.67591: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.67600: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.67781: variable 'connection_failed' from source: set_fact 34589 1727204124.67785: Evaluated conditional (not connection_failed): True 34589 1727204124.67797: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.67807: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.68029: variable 'connection_failed' from source: set_fact 34589 1727204124.68039: Evaluated conditional (not connection_failed): True 34589 1727204124.68154: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204124.68351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.70885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.70964: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.71008: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.71047: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.71087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.71171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.71213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.71246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.71294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.71315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.71381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.71399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.71480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.71487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.71494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.71542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.71572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.71610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.71654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.71679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.71864: variable 'network_connections' from source: play vars 34589 1727204124.71922: variable 'profile' from source: play vars 34589 1727204124.71971: variable 'profile' from source: play vars 34589 1727204124.71985: variable 'interface' from source: set_fact 34589 1727204124.72058: variable 'interface' from source: set_fact 34589 1727204124.72135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204124.72309: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204124.72336: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204124.72371: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204124.72394: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204124.72425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204124.72444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 
1727204124.72463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.72485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204124.72523: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204124.72675: variable 'network_connections' from source: play vars 34589 1727204124.72679: variable 'profile' from source: play vars 34589 1727204124.72726: variable 'profile' from source: play vars 34589 1727204124.72729: variable 'interface' from source: set_fact 34589 1727204124.72773: variable 'interface' from source: set_fact 34589 1727204124.72797: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204124.72801: when evaluation is False, skipping this task 34589 1727204124.72803: _execute() done 34589 1727204124.72805: dumping result to json 34589 1727204124.72808: done dumping result, returning 34589 1727204124.72813: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000006e] 34589 1727204124.72818: sending task result for task 028d2410-947f-a9c6-cddc-00000000006e 34589 1727204124.72912: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006e 34589 1727204124.72915: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204124.72964: no more pending results, returning what we have 34589 1727204124.72967: results queue empty 34589 1727204124.72968: checking for any_errors_fatal 34589 1727204124.72974: done checking for any_errors_fatal 34589 1727204124.72974: checking for max_fail_percentage 34589 1727204124.72982: done checking for max_fail_percentage 34589 1727204124.72983: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.72984: done checking to see if all hosts have failed 34589 1727204124.72989: getting the remaining hosts for this loop 34589 1727204124.72990: done getting the remaining hosts for this loop 34589 1727204124.72994: getting the next task for host managed-node1 34589 1727204124.73000: done getting next task for host managed-node1 34589 1727204124.73004: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34589 1727204124.73006: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204124.73019: getting variables 34589 1727204124.73020: in VariableManager get_vars() 34589 1727204124.73057: Calling all_inventory to load vars for managed-node1 34589 1727204124.73060: Calling groups_inventory to load vars for managed-node1 34589 1727204124.73062: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.73071: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.73073: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.73078: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.74011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.74890: done with get_vars() 34589 1727204124.74905: done getting variables 34589 1727204124.74950: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.083) 0:00:24.884 ***** 34589 1727204124.74971: entering _queue_task() for managed-node1/package 34589 1727204124.75213: worker is 1 (out of 1 available) 34589 1727204124.75225: exiting _queue_task() for managed-node1/package 34589 1727204124.75238: done queuing things up, now waiting for results queue to drain 34589 1727204124.75240: waiting for pending results... 34589 1727204124.75416: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 34589 1727204124.75493: in run() - task 028d2410-947f-a9c6-cddc-00000000006f 34589 1727204124.75507: variable 'ansible_search_path' from source: unknown 34589 1727204124.75510: variable 'ansible_search_path' from source: unknown 34589 1727204124.75540: calling self._execute() 34589 1727204124.75622: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.75626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.75634: variable 'omit' from source: magic vars 34589 1727204124.75910: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.75921: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.75994: variable 'connection_failed' from source: set_fact 34589 1727204124.75998: Evaluated conditional (not connection_failed): True 34589 1727204124.76074: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.76080: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.76147: variable 'connection_failed' from source: set_fact 34589 1727204124.76150: Evaluated conditional (not connection_failed): True 34589 1727204124.76282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204124.76471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204124.76503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204124.76557: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204124.76585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204124.76653: variable 'network_packages' from source: role '' defaults 34589 1727204124.76730: variable '__network_provider_setup' from source: role '' defaults 34589 1727204124.76738: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204124.76790: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204124.76797: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204124.76841: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204124.76955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.78269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.78315: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.78342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.78364: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.78394: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.78453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.78472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.78492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.78524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.78534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.78565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.78583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.78600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.78631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 34589 1727204124.78641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.78777: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204124.78848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.78864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.78882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.78906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.78919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.78979: variable 'ansible_python' from source: facts 34589 1727204124.78999: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204124.79059: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204124.79114: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204124.79197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.79216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.79233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.79261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.79278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.79307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.79325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.79341: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.79365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.79377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.79470: variable 'network_connections' from source: play vars 34589 1727204124.79474: variable 'profile' from source: play vars 34589 1727204124.79549: variable 'profile' from source: play vars 34589 1727204124.79554: variable 'interface' from source: set_fact 34589 1727204124.79607: variable 'interface' from source: set_fact 34589 1727204124.79656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204124.79674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204124.79697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.79723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204124.79756: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204124.79936: variable 'network_connections' from source: play vars 34589 1727204124.79939: variable 'profile' from source: play vars 34589 1727204124.80008: variable 'profile' from source: play vars 34589 1727204124.80016: variable 'interface' from source: set_fact 34589 1727204124.80066: variable 'interface' from source: set_fact 34589 1727204124.80092: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204124.80144: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204124.80347: variable 'network_connections' from source: play vars 34589 1727204124.80351: variable 'profile' from source: play vars 34589 1727204124.80400: variable 'profile' from source: play vars 34589 1727204124.80403: variable 'interface' from source: set_fact 34589 1727204124.80471: variable 'interface' from source: set_fact 34589 1727204124.80492: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204124.80544: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204124.80739: variable 'network_connections' from source: play vars 34589 1727204124.80743: variable 'profile' from source: play vars 34589 1727204124.80788: variable 'profile' from source: play vars 34589 1727204124.80791: variable 'interface' from source: set_fact 34589 1727204124.80862: variable 'interface' from source: set_fact 34589 1727204124.80899: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204124.80943: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204124.80949: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204124.80991: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204124.81127: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204124.81547: variable 'network_connections' from source: play vars 34589 1727204124.81551: variable 'profile' from source: play vars 34589 1727204124.81597: variable 'profile' from source: play vars 34589 1727204124.81600: variable 'interface' from source: set_fact 34589 1727204124.81644: variable 'interface' from source: set_fact 34589 1727204124.81651: variable 'ansible_distribution' from source: facts 34589 1727204124.81654: variable '__network_rh_distros' from source: role '' defaults 34589 1727204124.81660: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.81673: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204124.81780: variable 'ansible_distribution' from source: facts 34589 1727204124.81784: variable '__network_rh_distros' from source: role '' defaults 34589 1727204124.81787: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.81797: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204124.81904: variable 'ansible_distribution' from source: facts 34589 1727204124.81910: variable '__network_rh_distros' from source: role '' defaults 34589 1727204124.81913: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.81934: variable 'network_provider' from source: set_fact 34589 1727204124.81945: variable 'ansible_facts' from source: unknown 34589 1727204124.82354: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34589 1727204124.82358: when evaluation is False, skipping this task 34589 1727204124.82360: _execute() done 34589 1727204124.82363: dumping result to json 34589 1727204124.82365: done dumping result, returning 34589 1727204124.82373: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-a9c6-cddc-00000000006f] 34589 1727204124.82377: sending task result for task 028d2410-947f-a9c6-cddc-00000000006f 34589 1727204124.82467: done sending task result for task 028d2410-947f-a9c6-cddc-00000000006f 34589 1727204124.82469: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34589 1727204124.82521: no more pending results, returning what we have 34589 1727204124.82524: results queue empty 34589 1727204124.82525: checking for any_errors_fatal 34589 1727204124.82531: done checking for any_errors_fatal 34589 1727204124.82532: checking for max_fail_percentage 34589 1727204124.82533: done checking for max_fail_percentage 34589 1727204124.82534: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.82535: done checking to see if all hosts have failed 34589 1727204124.82536: getting the remaining hosts for this loop 34589 1727204124.82537: done getting the remaining hosts for this loop 34589 1727204124.82540: getting the next task for host managed-node1 34589 1727204124.82547: done getting next task for host 
managed-node1 34589 1727204124.82551: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204124.82557: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204124.82571: getting variables 34589 1727204124.82572: in VariableManager get_vars() 34589 1727204124.82623: Calling all_inventory to load vars for managed-node1 34589 1727204124.82626: Calling groups_inventory to load vars for managed-node1 34589 1727204124.82629: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.82638: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.82640: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.82642: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.83450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.84409: done with get_vars() 34589 1727204124.84424: done getting variables 34589 1727204124.84466: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.095) 0:00:24.979 ***** 34589 1727204124.84489: entering _queue_task() for managed-node1/package 34589 1727204124.84724: worker is 1 (out of 1 available) 34589 1727204124.84739: exiting _queue_task() for managed-node1/package 34589 1727204124.84751: done queuing things up, now waiting for results queue to drain 34589 1727204124.84753: waiting for pending results... 
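
Aside on the skip reported above: the "Install packages" task at roles/network/tasks/main.yml:73 is skipped because every entry in network_packages is already present in the previously gathered package facts, so the guard "not network_packages is subset(ansible_facts.packages.keys())" evaluates to False. The following is a minimal, hypothetical reconstruction of that guard; only the task name, task path, and when expression come from the log, while the module arguments and variable wiring are assumptions.

# Hypothetical sketch, not the collection's actual task file.
# Assumes ansible.builtin.package_facts populated ansible_facts.packages earlier in the play.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    # identical to the false_condition reported in the log: run the package
    # module only when at least one required package is missing from the facts
    - not network_packages is subset(ansible_facts.packages.keys())
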
34589 1727204124.84934: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204124.85005: in run() - task 028d2410-947f-a9c6-cddc-000000000070 34589 1727204124.85019: variable 'ansible_search_path' from source: unknown 34589 1727204124.85023: variable 'ansible_search_path' from source: unknown 34589 1727204124.85050: calling self._execute() 34589 1727204124.85132: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.85137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.85145: variable 'omit' from source: magic vars 34589 1727204124.85421: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.85430: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.85506: variable 'connection_failed' from source: set_fact 34589 1727204124.85513: Evaluated conditional (not connection_failed): True 34589 1727204124.85587: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.85590: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.85657: variable 'connection_failed' from source: set_fact 34589 1727204124.85660: Evaluated conditional (not connection_failed): True 34589 1727204124.85738: variable 'network_state' from source: role '' defaults 34589 1727204124.85749: Evaluated conditional (network_state != {}): False 34589 1727204124.85752: when evaluation is False, skipping this task 34589 1727204124.85755: _execute() done 34589 1727204124.85757: dumping result to json 34589 1727204124.85760: done dumping result, returning 34589 1727204124.85763: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-a9c6-cddc-000000000070] 34589 1727204124.85768: sending task result for task 028d2410-947f-a9c6-cddc-000000000070 34589 1727204124.85855: done sending task result for task 028d2410-947f-a9c6-cddc-000000000070 34589 1727204124.85858: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204124.85907: no more pending results, returning what we have 34589 1727204124.85911: results queue empty 34589 1727204124.85912: checking for any_errors_fatal 34589 1727204124.85921: done checking for any_errors_fatal 34589 1727204124.85922: checking for max_fail_percentage 34589 1727204124.85923: done checking for max_fail_percentage 34589 1727204124.85924: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.85925: done checking to see if all hosts have failed 34589 1727204124.85925: getting the remaining hosts for this loop 34589 1727204124.85927: done getting the remaining hosts for this loop 34589 1727204124.85931: getting the next task for host managed-node1 34589 1727204124.85936: done getting next task for host managed-node1 34589 1727204124.85940: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204124.85942: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 34589 1727204124.85955: getting variables 34589 1727204124.85957: in VariableManager get_vars() 34589 1727204124.85991: Calling all_inventory to load vars for managed-node1 34589 1727204124.85994: Calling groups_inventory to load vars for managed-node1 34589 1727204124.85996: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.86005: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.86007: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.86010: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.86781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.87678: done with get_vars() 34589 1727204124.87694: done getting variables 34589 1727204124.87739: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.032) 0:00:25.012 ***** 34589 1727204124.87760: entering _queue_task() for managed-node1/package 34589 1727204124.88000: worker is 1 (out of 1 available) 34589 1727204124.88016: exiting _queue_task() for managed-node1/package 34589 1727204124.88028: done queuing things up, now waiting for results queue to drain 34589 1727204124.88030: waiting for pending results... 
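
The task skipped just above ("Install NetworkManager and nmstate when using network_state variable", roles/network/tasks/main.yml:85) is gated on network_state, which is still the role's default empty dict in this run, so "network_state != {}" evaluates to False. A rough sketch of such a guard follows; the package list is inferred from the task title and is an assumption, only the when expression is verbatim from the log.

# Hypothetical sketch; package names inferred from the task name, not taken from the role source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}   # role default for network_state is an empty dict, hence the skip
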
34589 1727204124.88205: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204124.88280: in run() - task 028d2410-947f-a9c6-cddc-000000000071 34589 1727204124.88292: variable 'ansible_search_path' from source: unknown 34589 1727204124.88295: variable 'ansible_search_path' from source: unknown 34589 1727204124.88325: calling self._execute() 34589 1727204124.88403: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.88412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.88418: variable 'omit' from source: magic vars 34589 1727204124.88688: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.88702: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.88771: variable 'connection_failed' from source: set_fact 34589 1727204124.88774: Evaluated conditional (not connection_failed): True 34589 1727204124.88849: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.88853: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.88920: variable 'connection_failed' from source: set_fact 34589 1727204124.88924: Evaluated conditional (not connection_failed): True 34589 1727204124.89001: variable 'network_state' from source: role '' defaults 34589 1727204124.89011: Evaluated conditional (network_state != {}): False 34589 1727204124.89014: when evaluation is False, skipping this task 34589 1727204124.89017: _execute() done 34589 1727204124.89022: dumping result to json 34589 1727204124.89024: done dumping result, returning 34589 1727204124.89035: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-a9c6-cddc-000000000071] 34589 1727204124.89037: sending task result for task 028d2410-947f-a9c6-cddc-000000000071 34589 1727204124.89124: done sending task result for task 028d2410-947f-a9c6-cddc-000000000071 34589 1727204124.89127: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204124.89187: no more pending results, returning what we have 34589 1727204124.89191: results queue empty 34589 1727204124.89191: checking for any_errors_fatal 34589 1727204124.89199: done checking for any_errors_fatal 34589 1727204124.89200: checking for max_fail_percentage 34589 1727204124.89202: done checking for max_fail_percentage 34589 1727204124.89203: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.89203: done checking to see if all hosts have failed 34589 1727204124.89204: getting the remaining hosts for this loop 34589 1727204124.89205: done getting the remaining hosts for this loop 34589 1727204124.89211: getting the next task for host managed-node1 34589 1727204124.89216: done getting next task for host managed-node1 34589 1727204124.89220: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204124.89222: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 34589 1727204124.89235: getting variables 34589 1727204124.89236: in VariableManager get_vars() 34589 1727204124.89267: Calling all_inventory to load vars for managed-node1 34589 1727204124.89270: Calling groups_inventory to load vars for managed-node1 34589 1727204124.89272: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.89281: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.89284: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.89292: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.90206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.91085: done with get_vars() 34589 1727204124.91100: done getting variables 34589 1727204124.91146: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.034) 0:00:25.046 ***** 34589 1727204124.91166: entering _queue_task() for managed-node1/service 34589 1727204124.91399: worker is 1 (out of 1 available) 34589 1727204124.91416: exiting _queue_task() for managed-node1/service 34589 1727204124.91428: done queuing things up, now waiting for results queue to drain 34589 1727204124.91430: waiting for pending results... 
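
The "Restart NetworkManager due to wireless or team interfaces" task queued above (roles/network/tasks/main.yml:109) uses the same wireless/team guard that already skipped the consent prompt earlier in this run. A hedged sketch of that pattern is below; the service module and unit name are assumptions, and only the conditional expression appears in the log.

# Hypothetical sketch; the real task may use a different module or arguments.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
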
34589 1727204124.91598: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204124.91666: in run() - task 028d2410-947f-a9c6-cddc-000000000072 34589 1727204124.91679: variable 'ansible_search_path' from source: unknown 34589 1727204124.91682: variable 'ansible_search_path' from source: unknown 34589 1727204124.91713: calling self._execute() 34589 1727204124.91788: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204124.91792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204124.91800: variable 'omit' from source: magic vars 34589 1727204124.92072: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.92083: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.92162: variable 'connection_failed' from source: set_fact 34589 1727204124.92166: Evaluated conditional (not connection_failed): True 34589 1727204124.92244: variable 'ansible_distribution_major_version' from source: facts 34589 1727204124.92247: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204124.92316: variable 'connection_failed' from source: set_fact 34589 1727204124.92320: Evaluated conditional (not connection_failed): True 34589 1727204124.92392: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204124.92523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204124.94088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204124.94140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204124.94168: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204124.94197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204124.94217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204124.94274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.94301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.94319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.94345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.94356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.94395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.94412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.94427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.94451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.94462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.94491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204124.94514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204124.94529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.94553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204124.94564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204124.94682: variable 'network_connections' from source: play vars 34589 1727204124.94692: variable 'profile' from source: play vars 34589 1727204124.94751: variable 'profile' from source: play vars 34589 1727204124.94754: variable 'interface' from source: set_fact 34589 1727204124.94799: variable 'interface' from source: set_fact 34589 1727204124.94852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204124.94967: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204124.94996: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204124.95018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204124.95044: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204124.95071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204124.95089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 
1727204124.95110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204124.95127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204124.95166: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204124.95315: variable 'network_connections' from source: play vars 34589 1727204124.95318: variable 'profile' from source: play vars 34589 1727204124.95396: variable 'profile' from source: play vars 34589 1727204124.95400: variable 'interface' from source: set_fact 34589 1727204124.95681: variable 'interface' from source: set_fact 34589 1727204124.95684: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204124.95686: when evaluation is False, skipping this task 34589 1727204124.95688: _execute() done 34589 1727204124.95690: dumping result to json 34589 1727204124.95692: done dumping result, returning 34589 1727204124.95694: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-000000000072] 34589 1727204124.95695: sending task result for task 028d2410-947f-a9c6-cddc-000000000072 34589 1727204124.95755: done sending task result for task 028d2410-947f-a9c6-cddc-000000000072 34589 1727204124.95758: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204124.95824: no more pending results, returning what we have 34589 1727204124.95827: results queue empty 34589 1727204124.95827: checking for any_errors_fatal 34589 1727204124.95833: done checking for any_errors_fatal 34589 1727204124.95834: checking for max_fail_percentage 34589 1727204124.95836: done checking for max_fail_percentage 34589 1727204124.95837: checking to see if all hosts have failed and the running result is not ok 34589 1727204124.95837: done checking to see if all hosts have failed 34589 1727204124.95838: getting the remaining hosts for this loop 34589 1727204124.95839: done getting the remaining hosts for this loop 34589 1727204124.95842: getting the next task for host managed-node1 34589 1727204124.95847: done getting next task for host managed-node1 34589 1727204124.95850: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204124.95852: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204124.95864: getting variables 34589 1727204124.95865: in VariableManager get_vars() 34589 1727204124.95914: Calling all_inventory to load vars for managed-node1 34589 1727204124.95917: Calling groups_inventory to load vars for managed-node1 34589 1727204124.95919: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204124.95928: Calling all_plugins_play to load vars for managed-node1 34589 1727204124.95931: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204124.95934: Calling groups_plugins_play to load vars for managed-node1 34589 1727204124.97481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204124.99102: done with get_vars() 34589 1727204124.99131: done getting variables 34589 1727204124.99199: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.080) 0:00:25.127 ***** 34589 1727204124.99233: entering _queue_task() for managed-node1/service 34589 1727204124.99636: worker is 1 (out of 1 available) 34589 1727204124.99650: exiting _queue_task() for managed-node1/service 34589 1727204124.99663: done queuing things up, now waiting for results queue to drain 34589 1727204124.99664: waiting for pending results... 34589 1727204125.00095: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204125.00102: in run() - task 028d2410-947f-a9c6-cddc-000000000073 34589 1727204125.00220: variable 'ansible_search_path' from source: unknown 34589 1727204125.00225: variable 'ansible_search_path' from source: unknown 34589 1727204125.00228: calling self._execute() 34589 1727204125.00301: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.00348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.00351: variable 'omit' from source: magic vars 34589 1727204125.00802: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.00894: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.00960: variable 'connection_failed' from source: set_fact 34589 1727204125.00972: Evaluated conditional (not connection_failed): True 34589 1727204125.01122: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.01135: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.01245: variable 'connection_failed' from source: set_fact 34589 1727204125.01254: Evaluated conditional (not connection_failed): True 34589 1727204125.01443: variable 'network_provider' from source: set_fact 34589 1727204125.01581: variable 'network_state' from source: role '' defaults 34589 1727204125.01584: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34589 1727204125.01586: variable 'omit' from source: magic vars 34589 1727204125.01588: variable 'omit' from source: magic vars 34589 1727204125.01590: variable 'network_service_name' from source: 
role '' defaults 34589 1727204125.01639: variable 'network_service_name' from source: role '' defaults 34589 1727204125.01753: variable '__network_provider_setup' from source: role '' defaults 34589 1727204125.01764: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204125.01837: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204125.01851: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204125.01925: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204125.02254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204125.03937: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204125.03989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204125.04018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204125.04043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204125.04061: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204125.04150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.04199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.04203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.04387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.04390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.04393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.04395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.04397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.04400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.04406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.04621: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204125.04740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.04762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.04824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.04838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.04955: variable 'ansible_python' from source: facts 34589 1727204125.04978: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204125.05052: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204125.05128: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204125.05259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.05278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.05302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.05340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.05351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.05399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.05423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.05453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 
1727204125.05492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.05585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.05633: variable 'network_connections' from source: play vars 34589 1727204125.05639: variable 'profile' from source: play vars 34589 1727204125.05728: variable 'profile' from source: play vars 34589 1727204125.05733: variable 'interface' from source: set_fact 34589 1727204125.05804: variable 'interface' from source: set_fact 34589 1727204125.05911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204125.06095: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204125.06141: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204125.06184: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204125.06235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204125.06286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204125.06316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204125.06352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.06500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204125.06569: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204125.07100: variable 'network_connections' from source: play vars 34589 1727204125.07121: variable 'profile' from source: play vars 34589 1727204125.07211: variable 'profile' from source: play vars 34589 1727204125.07227: variable 'interface' from source: set_fact 34589 1727204125.07295: variable 'interface' from source: set_fact 34589 1727204125.07382: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204125.07439: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204125.07772: variable 'network_connections' from source: play vars 34589 1727204125.07973: variable 'profile' from source: play vars 34589 1727204125.07978: variable 'profile' from source: play vars 34589 1727204125.07980: variable 'interface' from source: set_fact 34589 1727204125.07982: variable 'interface' from source: set_fact 34589 1727204125.07984: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204125.08042: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204125.08335: variable 'network_connections' from source: play vars 34589 
1727204125.08344: variable 'profile' from source: play vars 34589 1727204125.08414: variable 'profile' from source: play vars 34589 1727204125.08431: variable 'interface' from source: set_fact 34589 1727204125.08506: variable 'interface' from source: set_fact 34589 1727204125.08581: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204125.08653: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204125.08667: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204125.08737: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204125.08954: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204125.09497: variable 'network_connections' from source: play vars 34589 1727204125.09518: variable 'profile' from source: play vars 34589 1727204125.09589: variable 'profile' from source: play vars 34589 1727204125.09597: variable 'interface' from source: set_fact 34589 1727204125.09673: variable 'interface' from source: set_fact 34589 1727204125.09689: variable 'ansible_distribution' from source: facts 34589 1727204125.09697: variable '__network_rh_distros' from source: role '' defaults 34589 1727204125.09705: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.09725: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204125.09903: variable 'ansible_distribution' from source: facts 34589 1727204125.09915: variable '__network_rh_distros' from source: role '' defaults 34589 1727204125.09924: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.09951: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204125.10170: variable 'ansible_distribution' from source: facts 34589 1727204125.10179: variable '__network_rh_distros' from source: role '' defaults 34589 1727204125.10182: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.10209: variable 'network_provider' from source: set_fact 34589 1727204125.10235: variable 'omit' from source: magic vars 34589 1727204125.10280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204125.10314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204125.10380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204125.10385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204125.10387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204125.10411: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204125.10420: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.10426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.10540: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204125.10580: Set connection var ansible_shell_executable to /bin/sh 34589 1727204125.10583: Set connection var ansible_timeout to 10 34589 1727204125.10585: Set connection var ansible_shell_type to sh 34589 
1727204125.10587: Set connection var ansible_connection to ssh 34589 1727204125.10588: Set connection var ansible_pipelining to False 34589 1727204125.10625: variable 'ansible_shell_executable' from source: unknown 34589 1727204125.10659: variable 'ansible_connection' from source: unknown 34589 1727204125.10663: variable 'ansible_module_compression' from source: unknown 34589 1727204125.10665: variable 'ansible_shell_type' from source: unknown 34589 1727204125.10667: variable 'ansible_shell_executable' from source: unknown 34589 1727204125.10673: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.10726: variable 'ansible_pipelining' from source: unknown 34589 1727204125.10729: variable 'ansible_timeout' from source: unknown 34589 1727204125.10731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.10832: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204125.10851: variable 'omit' from source: magic vars 34589 1727204125.11037: starting attempt loop 34589 1727204125.11040: running the handler 34589 1727204125.11042: variable 'ansible_facts' from source: unknown 34589 1727204125.11794: _low_level_execute_command(): starting 34589 1727204125.11819: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204125.12617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.12634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204125.12693: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.12816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.12939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.14851: stdout chunk (state=3): >>>/root <<< 34589 1727204125.14909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204125.14913: stdout chunk (state=3): >>><<< 34589 1727204125.14919: stderr chunk (state=3): >>><<< 34589 1727204125.14939: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204125.14955: _low_level_execute_command(): starting 34589 1727204125.14959: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048 `" && echo ansible-tmp-1727204125.1493976-37173-167731261195048="` echo /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048 `" ) && sleep 0' 34589 1727204125.16150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204125.16169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.16366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204125.16370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.16409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.16503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.18607: stdout chunk (state=3): >>>ansible-tmp-1727204125.1493976-37173-167731261195048=/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048 <<< 34589 1727204125.18733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204125.18799: stderr chunk (state=3): >>><<< 34589 1727204125.18809: stdout chunk (state=3): >>><<< 34589 1727204125.18839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.1493976-37173-167731261195048=/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204125.18883: variable 'ansible_module_compression' from source: unknown 34589 1727204125.18970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 34589 1727204125.19181: variable 'ansible_facts' from source: unknown 34589 1727204125.19286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py 34589 1727204125.19425: Sending initial data 34589 1727204125.19526: Sent initial data (156 bytes) 34589 1727204125.20127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204125.20141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.20152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204125.20164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.20257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.20281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.20389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.22163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204125.22240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204125.22322: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp65i23vjc /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py <<< 34589 1727204125.22325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py" <<< 34589 1727204125.22399: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp65i23vjc" to remote "/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py" <<< 34589 1727204125.24060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204125.24070: stderr chunk (state=3): >>><<< 34589 1727204125.24080: stdout chunk (state=3): >>><<< 34589 1727204125.24120: done transferring module to remote 34589 1727204125.24148: _low_level_execute_command(): starting 34589 1727204125.24151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/ /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py && sleep 0' 34589 1727204125.24561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.24567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204125.24601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.24606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.24665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.24668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.24765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.26761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204125.26765: stdout chunk (state=3): 
>>><<< 34589 1727204125.26767: stderr chunk (state=3): >>><<< 34589 1727204125.26881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204125.26884: _low_level_execute_command(): starting 34589 1727204125.26887: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/AnsiballZ_systemd.py && sleep 0' 34589 1727204125.27288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204125.27291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.27297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.27338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.27350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.27437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.58532: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", 
"RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10711040", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297169408", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1477139000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 34589 1727204125.58553: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "networ<<< 34589 1727204125.58561: stdout chunk (state=3): >>>k-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34589 1727204125.60852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204125.60878: stderr chunk (state=3): >>><<< 34589 1727204125.60882: stdout chunk (state=3): >>><<< 34589 1727204125.60898: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10711040", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297169408", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1477139000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204125.61062: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204125.61098: _low_level_execute_command(): starting 34589 1727204125.61102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.1493976-37173-167731261195048/ > /dev/null 2>&1 && sleep 0' 34589 1727204125.61674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204125.61693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204125.61696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204125.61718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204125.61721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.61754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204125.61758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.61829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204125.61832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.61845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.61973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204125.63978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204125.63998: stderr chunk (state=3): >>><<< 34589 1727204125.64001: stdout chunk (state=3): >>><<< 34589 1727204125.64016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204125.64022: handler run complete 34589 1727204125.64059: attempt loop complete, returning result 34589 1727204125.64063: _execute() done 34589 1727204125.64066: dumping result to json 34589 1727204125.64083: done dumping result, returning 34589 1727204125.64092: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-a9c6-cddc-000000000073] 34589 1727204125.64094: sending task result for task 028d2410-947f-a9c6-cddc-000000000073 34589 1727204125.64365: done sending task result for task 028d2410-947f-a9c6-cddc-000000000073 34589 1727204125.64368: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204125.64423: no more pending results, returning what we have 34589 1727204125.64426: results queue empty 34589 1727204125.64427: checking for any_errors_fatal 34589 1727204125.64434: done checking for any_errors_fatal 34589 1727204125.64435: checking for max_fail_percentage 34589 1727204125.64437: done checking for max_fail_percentage 34589 1727204125.64438: checking to see if all hosts have failed and the running result is not ok 34589 1727204125.64438: done checking to see if all hosts have failed 34589 1727204125.64439: getting the remaining hosts for this loop 34589 1727204125.64441: done getting the remaining hosts for this loop 34589 1727204125.64445: getting the next task for host managed-node1 34589 1727204125.64451: done getting next task for host managed-node1 34589 1727204125.64454: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204125.64456: ^ 
state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204125.64466: getting variables 34589 1727204125.64467: in VariableManager get_vars() 34589 1727204125.64510: Calling all_inventory to load vars for managed-node1 34589 1727204125.64514: Calling groups_inventory to load vars for managed-node1 34589 1727204125.64516: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204125.64527: Calling all_plugins_play to load vars for managed-node1 34589 1727204125.64530: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204125.64533: Calling groups_plugins_play to load vars for managed-node1 34589 1727204125.65858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204125.66741: done with get_vars() 34589 1727204125.66759: done getting variables 34589 1727204125.66804: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.675) 0:00:25.803 ***** 34589 1727204125.66827: entering _queue_task() for managed-node1/service 34589 1727204125.67079: worker is 1 (out of 1 available) 34589 1727204125.67094: exiting _queue_task() for managed-node1/service 34589 1727204125.67110: done queuing things up, now waiting for results queue to drain 34589 1727204125.67111: waiting for pending results... 
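For reference, the "Enable and start NetworkManager" task traced above resolves to ansible.legacy.systemd with the module args visible in the result (name=NetworkManager, state=started, enabled=true), and its output is censored because no_log: true was specified for the result. A minimal standalone sketch of an equivalent task, assuming ansible.builtin.systemd as the resolved module (not a verbatim copy of the role's task file):

    # Rough reconstruction from the module args shown in the trace above;
    # the real task lives in the fedora.linux_system_roles.network role's
    # tasks/main.yml and is not reproduced verbatim here.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager      # unit reported loaded/active/enabled in the systemd status dump
        state: started
        enabled: true
      no_log: true                # matches the censored result in the log
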
34589 1727204125.67289: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204125.67357: in run() - task 028d2410-947f-a9c6-cddc-000000000074 34589 1727204125.67369: variable 'ansible_search_path' from source: unknown 34589 1727204125.67372: variable 'ansible_search_path' from source: unknown 34589 1727204125.67402: calling self._execute() 34589 1727204125.67483: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.67487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.67495: variable 'omit' from source: magic vars 34589 1727204125.67768: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.67781: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.67855: variable 'connection_failed' from source: set_fact 34589 1727204125.67858: Evaluated conditional (not connection_failed): True 34589 1727204125.67935: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.67938: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.68005: variable 'connection_failed' from source: set_fact 34589 1727204125.68011: Evaluated conditional (not connection_failed): True 34589 1727204125.68084: variable 'network_provider' from source: set_fact 34589 1727204125.68089: Evaluated conditional (network_provider == "nm"): True 34589 1727204125.68152: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204125.68216: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204125.68331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204125.69743: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204125.69788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204125.69818: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204125.69845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204125.69867: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204125.69944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.69969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.69989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.70018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.70028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.70060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.70084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.70101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.70127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.70137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.70165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.70189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.70205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.70231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.70242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.70345: variable 'network_connections' from source: play vars 34589 1727204125.70354: variable 'profile' from source: play vars 34589 1727204125.70411: variable 'profile' from source: play vars 34589 1727204125.70414: variable 'interface' from source: set_fact 34589 1727204125.70454: variable 'interface' from source: set_fact 34589 1727204125.70510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204125.70616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204125.70643: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204125.70664: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204125.70687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204125.70721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 
34589 1727204125.70737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204125.70755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.70772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204125.70811: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204125.71084: variable 'network_connections' from source: play vars 34589 1727204125.71096: variable 'profile' from source: play vars 34589 1727204125.71135: variable 'profile' from source: play vars 34589 1727204125.71139: variable 'interface' from source: set_fact 34589 1727204125.71184: variable 'interface' from source: set_fact 34589 1727204125.71205: Evaluated conditional (__network_wpa_supplicant_required): False 34589 1727204125.71211: when evaluation is False, skipping this task 34589 1727204125.71213: _execute() done 34589 1727204125.71216: dumping result to json 34589 1727204125.71218: done dumping result, returning 34589 1727204125.71223: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-a9c6-cddc-000000000074] 34589 1727204125.71228: sending task result for task 028d2410-947f-a9c6-cddc-000000000074 34589 1727204125.71313: done sending task result for task 028d2410-947f-a9c6-cddc-000000000074 34589 1727204125.71315: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34589 1727204125.71364: no more pending results, returning what we have 34589 1727204125.71367: results queue empty 34589 1727204125.71368: checking for any_errors_fatal 34589 1727204125.71390: done checking for any_errors_fatal 34589 1727204125.71391: checking for max_fail_percentage 34589 1727204125.71392: done checking for max_fail_percentage 34589 1727204125.71393: checking to see if all hosts have failed and the running result is not ok 34589 1727204125.71394: done checking to see if all hosts have failed 34589 1727204125.71394: getting the remaining hosts for this loop 34589 1727204125.71396: done getting the remaining hosts for this loop 34589 1727204125.71399: getting the next task for host managed-node1 34589 1727204125.71406: done getting next task for host managed-node1 34589 1727204125.71411: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204125.71413: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204125.71427: getting variables 34589 1727204125.71428: in VariableManager get_vars() 34589 1727204125.71464: Calling all_inventory to load vars for managed-node1 34589 1727204125.71467: Calling groups_inventory to load vars for managed-node1 34589 1727204125.71469: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204125.71484: Calling all_plugins_play to load vars for managed-node1 34589 1727204125.71487: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204125.71490: Calling groups_plugins_play to load vars for managed-node1 34589 1727204125.72432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204125.74030: done with get_vars() 34589 1727204125.74059: done getting variables 34589 1727204125.74127: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.073) 0:00:25.876 ***** 34589 1727204125.74156: entering _queue_task() for managed-node1/service 34589 1727204125.74525: worker is 1 (out of 1 available) 34589 1727204125.74541: exiting _queue_task() for managed-node1/service 34589 1727204125.74555: done queuing things up, now waiting for results queue to drain 34589 1727204125.74556: waiting for pending results... 34589 1727204125.75009: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204125.75015: in run() - task 028d2410-947f-a9c6-cddc-000000000075 34589 1727204125.75019: variable 'ansible_search_path' from source: unknown 34589 1727204125.75082: variable 'ansible_search_path' from source: unknown 34589 1727204125.75088: calling self._execute() 34589 1727204125.75186: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.75314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.75317: variable 'omit' from source: magic vars 34589 1727204125.75629: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.75650: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.75769: variable 'connection_failed' from source: set_fact 34589 1727204125.75784: Evaluated conditional (not connection_failed): True 34589 1727204125.75910: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.75923: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.76032: variable 'connection_failed' from source: set_fact 34589 1727204125.76044: Evaluated conditional (not connection_failed): True 34589 1727204125.76167: variable 'network_provider' from source: set_fact 34589 1727204125.76186: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204125.76298: when evaluation is False, skipping this task 34589 1727204125.76302: _execute() done 34589 1727204125.76304: dumping result to json 34589 1727204125.76309: done dumping result, returning 34589 1727204125.76313: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Enable network service [028d2410-947f-a9c6-cddc-000000000075] 34589 1727204125.76315: sending task result for task 028d2410-947f-a9c6-cddc-000000000075 34589 1727204125.76387: done sending task result for task 028d2410-947f-a9c6-cddc-000000000075 34589 1727204125.76390: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204125.76451: no more pending results, returning what we have 34589 1727204125.76455: results queue empty 34589 1727204125.76456: checking for any_errors_fatal 34589 1727204125.76466: done checking for any_errors_fatal 34589 1727204125.76468: checking for max_fail_percentage 34589 1727204125.76470: done checking for max_fail_percentage 34589 1727204125.76474: checking to see if all hosts have failed and the running result is not ok 34589 1727204125.76474: done checking to see if all hosts have failed 34589 1727204125.76579: getting the remaining hosts for this loop 34589 1727204125.76582: done getting the remaining hosts for this loop 34589 1727204125.76587: getting the next task for host managed-node1 34589 1727204125.76594: done getting next task for host managed-node1 34589 1727204125.76598: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204125.76602: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204125.76620: getting variables 34589 1727204125.76622: in VariableManager get_vars() 34589 1727204125.76664: Calling all_inventory to load vars for managed-node1 34589 1727204125.76667: Calling groups_inventory to load vars for managed-node1 34589 1727204125.76670: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204125.76864: Calling all_plugins_play to load vars for managed-node1 34589 1727204125.76868: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204125.76871: Calling groups_plugins_play to load vars for managed-node1 34589 1727204125.77811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204125.78689: done with get_vars() 34589 1727204125.78707: done getting variables 34589 1727204125.78749: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.046) 0:00:25.922 ***** 34589 1727204125.78770: entering _queue_task() for managed-node1/copy 34589 1727204125.79011: worker is 1 (out of 1 available) 34589 1727204125.79025: exiting _queue_task() for managed-node1/copy 34589 1727204125.79037: done queuing things up, now waiting for results queue to drain 34589 1727204125.79038: waiting for pending results... 
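Both initscripts-only tasks around this point (the service task at main.yml:142 above and the copy task at main.yml:150 queued next) share the same provider guard, which is why they are skipped when the resolved network_provider is "nm" rather than "initscripts". A minimal sketch of that guard pattern, with hypothetical task bodies (only the task names, the action plugins, and the when condition are taken from this log; the role's actual task text is not reproduced here):

    - name: Enable network service
      service:
        name: network        # illustrative body only
        enabled: true
      when: network_provider == "initscripts"

    - name: Ensure initscripts network file dependency is present
      copy:
        dest: /etc/sysconfig/network                          # illustrative body only
        content: "# Created by the network system role\n"     # illustrative body only
      when: network_provider == "initscripts"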
34589 1727204125.79395: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204125.79401: in run() - task 028d2410-947f-a9c6-cddc-000000000076 34589 1727204125.79404: variable 'ansible_search_path' from source: unknown 34589 1727204125.79406: variable 'ansible_search_path' from source: unknown 34589 1727204125.79425: calling self._execute() 34589 1727204125.79535: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.79547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.79562: variable 'omit' from source: magic vars 34589 1727204125.79942: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.79957: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.80080: variable 'connection_failed' from source: set_fact 34589 1727204125.80157: Evaluated conditional (not connection_failed): True 34589 1727204125.80202: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.80212: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.80320: variable 'connection_failed' from source: set_fact 34589 1727204125.80330: Evaluated conditional (not connection_failed): True 34589 1727204125.80448: variable 'network_provider' from source: set_fact 34589 1727204125.80458: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204125.80465: when evaluation is False, skipping this task 34589 1727204125.80472: _execute() done 34589 1727204125.80487: dumping result to json 34589 1727204125.80494: done dumping result, returning 34589 1727204125.80508: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-a9c6-cddc-000000000076] 34589 1727204125.80517: sending task result for task 028d2410-947f-a9c6-cddc-000000000076 34589 1727204125.80720: done sending task result for task 028d2410-947f-a9c6-cddc-000000000076 34589 1727204125.80724: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34589 1727204125.80771: no more pending results, returning what we have 34589 1727204125.80774: results queue empty 34589 1727204125.80777: checking for any_errors_fatal 34589 1727204125.80787: done checking for any_errors_fatal 34589 1727204125.80788: checking for max_fail_percentage 34589 1727204125.80789: done checking for max_fail_percentage 34589 1727204125.80791: checking to see if all hosts have failed and the running result is not ok 34589 1727204125.80791: done checking to see if all hosts have failed 34589 1727204125.80792: getting the remaining hosts for this loop 34589 1727204125.80793: done getting the remaining hosts for this loop 34589 1727204125.80797: getting the next task for host managed-node1 34589 1727204125.80802: done getting next task for host managed-node1 34589 1727204125.80806: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204125.80810: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 34589 1727204125.80823: getting variables 34589 1727204125.80825: in VariableManager get_vars() 34589 1727204125.80858: Calling all_inventory to load vars for managed-node1 34589 1727204125.80860: Calling groups_inventory to load vars for managed-node1 34589 1727204125.80862: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204125.80870: Calling all_plugins_play to load vars for managed-node1 34589 1727204125.80873: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204125.80877: Calling groups_plugins_play to load vars for managed-node1 34589 1727204125.82369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204125.83980: done with get_vars() 34589 1727204125.84001: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.052) 0:00:25.975 ***** 34589 1727204125.84061: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204125.84315: worker is 1 (out of 1 available) 34589 1727204125.84330: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204125.84344: done queuing things up, now waiting for results queue to drain 34589 1727204125.84346: waiting for pending results... 34589 1727204125.84522: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204125.84587: in run() - task 028d2410-947f-a9c6-cddc-000000000077 34589 1727204125.84600: variable 'ansible_search_path' from source: unknown 34589 1727204125.84604: variable 'ansible_search_path' from source: unknown 34589 1727204125.84631: calling self._execute() 34589 1727204125.84781: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.84785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.84789: variable 'omit' from source: magic vars 34589 1727204125.85181: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.85206: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.85381: variable 'connection_failed' from source: set_fact 34589 1727204125.85384: Evaluated conditional (not connection_failed): True 34589 1727204125.85460: variable 'ansible_distribution_major_version' from source: facts 34589 1727204125.85471: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204125.85591: variable 'connection_failed' from source: set_fact 34589 1727204125.85605: Evaluated conditional (not connection_failed): True 34589 1727204125.85619: variable 'omit' from source: magic vars 34589 1727204125.85666: variable 'omit' from source: magic vars 34589 1727204125.85857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204125.88143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204125.88170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204125.88216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
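The template lookup recorded just below resolves get_ansible_managed.j2 from the role's search paths to build the "# Ansible managed ... system_role:network" header that later shows up in the module arguments as __header. A hedged sketch of that pattern (the role's real task and variable names may differ; __header is only known here as the module parameter name, and the task shown is hypothetical):

    - name: Resolve the ansible_managed header for generated profiles
      set_fact:
        __header: "{{ lookup('template', 'get_ansible_managed.j2') }}"  # template name taken from the log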
34589 1727204125.88261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204125.88292: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204125.88383: variable 'network_provider' from source: set_fact 34589 1727204125.88538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204125.88574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204125.88782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204125.88786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204125.88789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204125.88791: variable 'omit' from source: magic vars 34589 1727204125.88866: variable 'omit' from source: magic vars 34589 1727204125.88982: variable 'network_connections' from source: play vars 34589 1727204125.88999: variable 'profile' from source: play vars 34589 1727204125.89072: variable 'profile' from source: play vars 34589 1727204125.89085: variable 'interface' from source: set_fact 34589 1727204125.89155: variable 'interface' from source: set_fact 34589 1727204125.89311: variable 'omit' from source: magic vars 34589 1727204125.89326: variable '__lsr_ansible_managed' from source: task vars 34589 1727204125.89396: variable '__lsr_ansible_managed' from source: task vars 34589 1727204125.89689: Loaded config def from plugin (lookup/template) 34589 1727204125.89699: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34589 1727204125.89737: File lookup term: get_ansible_managed.j2 34589 1727204125.89745: variable 'ansible_search_path' from source: unknown 34589 1727204125.89754: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34589 1727204125.89780: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34589 1727204125.89804: variable 'ansible_search_path' from source: unknown 34589 1727204125.97750: variable 'ansible_managed' from source: unknown 34589 1727204125.97874: variable 'omit' from source: magic vars 34589 1727204125.97903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204125.97934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204125.97965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204125.97968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204125.98075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204125.98080: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204125.98082: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.98084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.98106: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204125.98112: Set connection var ansible_shell_executable to /bin/sh 34589 1727204125.98119: Set connection var ansible_timeout to 10 34589 1727204125.98127: Set connection var ansible_shell_type to sh 34589 1727204125.98135: Set connection var ansible_connection to ssh 34589 1727204125.98140: Set connection var ansible_pipelining to False 34589 1727204125.98162: variable 'ansible_shell_executable' from source: unknown 34589 1727204125.98165: variable 'ansible_connection' from source: unknown 34589 1727204125.98167: variable 'ansible_module_compression' from source: unknown 34589 1727204125.98169: variable 'ansible_shell_type' from source: unknown 34589 1727204125.98171: variable 'ansible_shell_executable' from source: unknown 34589 1727204125.98187: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204125.98189: variable 'ansible_pipelining' from source: unknown 34589 1727204125.98191: variable 'ansible_timeout' from source: unknown 34589 1727204125.98193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204125.98318: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204125.98327: variable 'omit' from source: magic vars 34589 1727204125.98334: starting attempt loop 34589 1727204125.98336: running the handler 34589 1727204125.98401: _low_level_execute_command(): starting 34589 1727204125.98404: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204125.99104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204125.99181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204125.99184: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204125.99187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204125.99189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204125.99198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204125.99222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204125.99437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.01226: stdout chunk (state=3): >>>/root <<< 34589 1727204126.01482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.01485: stdout chunk (state=3): >>><<< 34589 1727204126.01487: stderr chunk (state=3): >>><<< 34589 1727204126.01490: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204126.01492: _low_level_execute_command(): starting 34589 1727204126.01495: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942 `" && echo ansible-tmp-1727204126.0146117-37216-132732356937942="` echo /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942 `" ) && sleep 0' 34589 1727204126.02293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204126.02302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.02390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.02403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.02414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.02433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.02541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.04623: stdout chunk (state=3): >>>ansible-tmp-1727204126.0146117-37216-132732356937942=/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942 <<< 34589 1727204126.04767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.04785: stderr chunk (state=3): >>><<< 34589 1727204126.04799: stdout chunk (state=3): >>><<< 34589 1727204126.04981: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204126.0146117-37216-132732356937942=/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204126.04985: variable 'ansible_module_compression' from source: unknown 34589 1727204126.04988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 34589 1727204126.04990: variable 'ansible_facts' from source: unknown 34589 1727204126.05088: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py 34589 1727204126.05243: Sending initial data 34589 1727204126.05288: Sent initial data (168 bytes) 34589 1727204126.06082: stderr chunk 
(state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204126.06086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.06088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204126.06091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.06104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.06117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.06277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.06393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.08119: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34589 1727204126.08149: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204126.08310: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204126.08404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpc5mgh8r9 /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py <<< 34589 1727204126.08408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py" <<< 34589 1727204126.08488: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpc5mgh8r9" to remote "/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py" <<< 34589 1727204126.09881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.09942: stderr chunk (state=3): >>><<< 34589 1727204126.09950: stdout chunk (state=3): >>><<< 34589 1727204126.10100: done transferring module to remote 34589 1727204126.10104: _low_level_execute_command(): starting 34589 1727204126.10109: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/ /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py && sleep 0' 34589 1727204126.10714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204126.10729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.10746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204126.10764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204126.10792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.10893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.10919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.10934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.11043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.13070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.13075: stdout chunk (state=3): >>><<< 34589 1727204126.13192: stderr chunk (state=3): >>><<< 34589 1727204126.13196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204126.13199: _low_level_execute_command(): starting 34589 1727204126.13201: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/AnsiballZ_network_connections.py && sleep 0' 34589 1727204126.13792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204126.13815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.13894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.13955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.13974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.14002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.14131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.42961: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34589 1727204126.45177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204126.45182: stdout chunk (state=3): >>><<< 34589 1727204126.45185: stderr chunk (state=3): >>><<< 34589 1727204126.45331: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
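The invocation above pins down what the play asked the role to do: take the ethtest0 profile down through the nm provider, with no change required. A hedged reconstruction of play variables that would produce those module arguments (the test playbook itself is not part of this log; everything except ethtest0, state: down, and the role name is illustrative):

    - hosts: managed-node1
      vars:
        interface: ethtest0           # matches the 'interface' set_fact resolved earlier
        profile: "{{ interface }}"    # 'profile' is repeatedly resolved from play vars above
        network_connections:
          - name: "{{ profile }}"
            state: down
      roles:
        - fedora.linux_system_roles.network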
34589 1727204126.45335: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204126.45338: _low_level_execute_command(): starting 34589 1727204126.45340: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204126.0146117-37216-132732356937942/ > /dev/null 2>&1 && sleep 0' 34589 1727204126.45935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204126.45950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.45965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204126.46014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204126.46029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204126.46043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204126.46130: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.46178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.46257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.48380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.48384: stdout chunk (state=3): >>><<< 34589 1727204126.48386: stderr chunk (state=3): >>><<< 34589 1727204126.48388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204126.48390: handler run complete 34589 1727204126.48392: attempt loop complete, returning result 34589 1727204126.48394: _execute() done 34589 1727204126.48396: dumping result to json 34589 1727204126.48398: done dumping result, returning 34589 1727204126.48400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-a9c6-cddc-000000000077] 34589 1727204126.48402: sending task result for task 028d2410-947f-a9c6-cddc-000000000077 34589 1727204126.48479: done sending task result for task 028d2410-947f-a9c6-cddc-000000000077 34589 1727204126.48483: WORKER PROCESS EXITING ok: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 34589 1727204126.48781: no more pending results, returning what we have 34589 1727204126.48785: results queue empty 34589 1727204126.48786: checking for any_errors_fatal 34589 1727204126.48792: done checking for any_errors_fatal 34589 1727204126.48793: checking for max_fail_percentage 34589 1727204126.48795: done checking for max_fail_percentage 34589 1727204126.48796: checking to see if all hosts have failed and the running result is not ok 34589 1727204126.48797: done checking to see if all hosts have failed 34589 1727204126.48798: getting the remaining hosts for this loop 34589 1727204126.48799: done getting the remaining hosts for this loop 34589 1727204126.48803: getting the next task for host managed-node1 34589 1727204126.48812: done getting next task for host managed-node1 34589 1727204126.48815: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204126.48817: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204126.48827: getting variables 34589 1727204126.48829: in VariableManager get_vars() 34589 1727204126.48866: Calling all_inventory to load vars for managed-node1 34589 1727204126.48869: Calling groups_inventory to load vars for managed-node1 34589 1727204126.48871: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204126.48887: Calling all_plugins_play to load vars for managed-node1 34589 1727204126.48890: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204126.48893: Calling groups_plugins_play to load vars for managed-node1 34589 1727204126.50436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204126.52100: done with get_vars() 34589 1727204126.52131: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.681) 0:00:26.657 ***** 34589 1727204126.52227: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204126.52717: worker is 1 (out of 1 available) 34589 1727204126.52727: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204126.52738: done queuing things up, now waiting for results queue to drain 34589 1727204126.52739: waiting for pending results... 34589 1727204126.52980: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204126.53076: in run() - task 028d2410-947f-a9c6-cddc-000000000078 34589 1727204126.53085: variable 'ansible_search_path' from source: unknown 34589 1727204126.53117: variable 'ansible_search_path' from source: unknown 34589 1727204126.53139: calling self._execute() 34589 1727204126.53251: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.53262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.53289: variable 'omit' from source: magic vars 34589 1727204126.53687: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.53724: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.53836: variable 'connection_failed' from source: set_fact 34589 1727204126.53846: Evaluated conditional (not connection_failed): True 34589 1727204126.53982: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.53986: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.54083: variable 'connection_failed' from source: set_fact 34589 1727204126.54201: Evaluated conditional (not connection_failed): True 34589 1727204126.54220: variable 'network_state' from source: role '' defaults 34589 1727204126.54234: Evaluated conditional (network_state != {}): False 34589 1727204126.54240: when evaluation is False, skipping this task 34589 1727204126.54247: _execute() done 34589 1727204126.54254: dumping result to json 34589 1727204126.54261: done dumping result, returning 34589 1727204126.54271: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-a9c6-cddc-000000000078] 34589 1727204126.54282: sending task result for task 028d2410-947f-a9c6-cddc-000000000078 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", 
"skip_reason": "Conditional result was False" } 34589 1727204126.54470: no more pending results, returning what we have 34589 1727204126.54474: results queue empty 34589 1727204126.54475: checking for any_errors_fatal 34589 1727204126.54492: done checking for any_errors_fatal 34589 1727204126.54492: checking for max_fail_percentage 34589 1727204126.54494: done checking for max_fail_percentage 34589 1727204126.54495: checking to see if all hosts have failed and the running result is not ok 34589 1727204126.54496: done checking to see if all hosts have failed 34589 1727204126.54497: getting the remaining hosts for this loop 34589 1727204126.54498: done getting the remaining hosts for this loop 34589 1727204126.54502: getting the next task for host managed-node1 34589 1727204126.54510: done getting next task for host managed-node1 34589 1727204126.54515: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204126.54630: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204126.54647: getting variables 34589 1727204126.54649: in VariableManager get_vars() 34589 1727204126.54690: Calling all_inventory to load vars for managed-node1 34589 1727204126.54693: Calling groups_inventory to load vars for managed-node1 34589 1727204126.54696: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204126.54711: Calling all_plugins_play to load vars for managed-node1 34589 1727204126.54714: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204126.54717: Calling groups_plugins_play to load vars for managed-node1 34589 1727204126.55249: done sending task result for task 028d2410-947f-a9c6-cddc-000000000078 34589 1727204126.55252: WORKER PROCESS EXITING 34589 1727204126.56385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204126.58198: done with get_vars() 34589 1727204126.58223: done getting variables 34589 1727204126.58407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.062) 0:00:26.719 ***** 34589 1727204126.58436: entering _queue_task() for managed-node1/debug 34589 1727204126.58799: worker is 1 (out of 1 available) 34589 1727204126.58880: exiting _queue_task() for managed-node1/debug 34589 1727204126.58892: done queuing things up, now waiting for results queue to drain 34589 1727204126.58894: waiting for pending results... 
34589 1727204126.59130: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204126.59241: in run() - task 028d2410-947f-a9c6-cddc-000000000079 34589 1727204126.59273: variable 'ansible_search_path' from source: unknown 34589 1727204126.59285: variable 'ansible_search_path' from source: unknown 34589 1727204126.59324: calling self._execute() 34589 1727204126.59457: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.59477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.59495: variable 'omit' from source: magic vars 34589 1727204126.59971: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.59990: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.60108: variable 'connection_failed' from source: set_fact 34589 1727204126.60124: Evaluated conditional (not connection_failed): True 34589 1727204126.60238: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.60249: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.60354: variable 'connection_failed' from source: set_fact 34589 1727204126.60364: Evaluated conditional (not connection_failed): True 34589 1727204126.60377: variable 'omit' from source: magic vars 34589 1727204126.60417: variable 'omit' from source: magic vars 34589 1727204126.60462: variable 'omit' from source: magic vars 34589 1727204126.60510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204126.60548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204126.60673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204126.60678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.60680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.60683: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204126.60685: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.60687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.60760: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204126.60770: Set connection var ansible_shell_executable to /bin/sh 34589 1727204126.60792: Set connection var ansible_timeout to 10 34589 1727204126.60880: Set connection var ansible_shell_type to sh 34589 1727204126.60888: Set connection var ansible_connection to ssh 34589 1727204126.60891: Set connection var ansible_pipelining to False 34589 1727204126.60893: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.60895: variable 'ansible_connection' from source: unknown 34589 1727204126.60897: variable 'ansible_module_compression' from source: unknown 34589 1727204126.60899: variable 'ansible_shell_type' from source: unknown 34589 1727204126.60901: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.60903: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.60905: variable 'ansible_pipelining' from source: unknown 34589 1727204126.60906: 
variable 'ansible_timeout' from source: unknown 34589 1727204126.60908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.61081: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204126.61084: variable 'omit' from source: magic vars 34589 1727204126.61086: starting attempt loop 34589 1727204126.61088: running the handler 34589 1727204126.61181: variable '__network_connections_result' from source: set_fact 34589 1727204126.61243: handler run complete 34589 1727204126.61264: attempt loop complete, returning result 34589 1727204126.61270: _execute() done 34589 1727204126.61279: dumping result to json 34589 1727204126.61286: done dumping result, returning 34589 1727204126.61298: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-a9c6-cddc-000000000079] 34589 1727204126.61324: sending task result for task 028d2410-947f-a9c6-cddc-000000000079 ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 34589 1727204126.61600: no more pending results, returning what we have 34589 1727204126.61604: results queue empty 34589 1727204126.61605: checking for any_errors_fatal 34589 1727204126.61612: done checking for any_errors_fatal 34589 1727204126.61613: checking for max_fail_percentage 34589 1727204126.61615: done checking for max_fail_percentage 34589 1727204126.61616: checking to see if all hosts have failed and the running result is not ok 34589 1727204126.61617: done checking to see if all hosts have failed 34589 1727204126.61617: getting the remaining hosts for this loop 34589 1727204126.61619: done getting the remaining hosts for this loop 34589 1727204126.61623: getting the next task for host managed-node1 34589 1727204126.61629: done getting next task for host managed-node1 34589 1727204126.61633: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204126.61635: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204126.61644: getting variables 34589 1727204126.61646: in VariableManager get_vars() 34589 1727204126.61686: Calling all_inventory to load vars for managed-node1 34589 1727204126.61690: Calling groups_inventory to load vars for managed-node1 34589 1727204126.61693: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204126.61704: Calling all_plugins_play to load vars for managed-node1 34589 1727204126.61707: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204126.61711: Calling groups_plugins_play to load vars for managed-node1 34589 1727204126.62683: done sending task result for task 028d2410-947f-a9c6-cddc-000000000079 34589 1727204126.62686: WORKER PROCESS EXITING 34589 1727204126.64787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204126.67492: done with get_vars() 34589 1727204126.67515: done getting variables 34589 1727204126.67574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.091) 0:00:26.811 ***** 34589 1727204126.67614: entering _queue_task() for managed-node1/debug 34589 1727204126.68067: worker is 1 (out of 1 available) 34589 1727204126.68083: exiting _queue_task() for managed-node1/debug 34589 1727204126.68094: done queuing things up, now waiting for results queue to drain 34589 1727204126.68096: waiting for pending results... 
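The result just above ("__network_connections_result.stderr_lines": [""]) and the task header that follows come from two small debug tasks in the fedora.linux_system_roles.network role (roles/network/tasks/main.yml). A minimal sketch of what those tasks likely look like is shown here; the exact wording is an assumption reconstructed from the task names and the variables the log reports, not the verbatim role source. The log also shows both tasks gated by the conditionals ansible_distribution_major_version != '6' and not connection_failed before they run.

    # Sketch only: reconstructed from the task names and debugged variables in the log above.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result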
34589 1727204126.68328: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204126.68418: in run() - task 028d2410-947f-a9c6-cddc-00000000007a 34589 1727204126.68432: variable 'ansible_search_path' from source: unknown 34589 1727204126.68436: variable 'ansible_search_path' from source: unknown 34589 1727204126.68484: calling self._execute() 34589 1727204126.68591: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.68597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.68609: variable 'omit' from source: magic vars 34589 1727204126.69017: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.69090: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.69267: variable 'connection_failed' from source: set_fact 34589 1727204126.69451: Evaluated conditional (not connection_failed): True 34589 1727204126.69454: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.69594: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.69692: variable 'connection_failed' from source: set_fact 34589 1727204126.69695: Evaluated conditional (not connection_failed): True 34589 1727204126.69795: variable 'omit' from source: magic vars 34589 1727204126.69838: variable 'omit' from source: magic vars 34589 1727204126.69881: variable 'omit' from source: magic vars 34589 1727204126.70080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204126.70084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204126.70086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204126.70089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.70091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.70094: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204126.70096: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.70099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.70156: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204126.70162: Set connection var ansible_shell_executable to /bin/sh 34589 1727204126.70170: Set connection var ansible_timeout to 10 34589 1727204126.70173: Set connection var ansible_shell_type to sh 34589 1727204126.70181: Set connection var ansible_connection to ssh 34589 1727204126.70186: Set connection var ansible_pipelining to False 34589 1727204126.70212: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.70215: variable 'ansible_connection' from source: unknown 34589 1727204126.70218: variable 'ansible_module_compression' from source: unknown 34589 1727204126.70224: variable 'ansible_shell_type' from source: unknown 34589 1727204126.70228: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.70230: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.70232: variable 'ansible_pipelining' from source: unknown 34589 1727204126.70235: 
variable 'ansible_timeout' from source: unknown 34589 1727204126.70246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.70390: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204126.70402: variable 'omit' from source: magic vars 34589 1727204126.70410: starting attempt loop 34589 1727204126.70413: running the handler 34589 1727204126.70467: variable '__network_connections_result' from source: set_fact 34589 1727204126.70536: variable '__network_connections_result' from source: set_fact 34589 1727204126.70642: handler run complete 34589 1727204126.70671: attempt loop complete, returning result 34589 1727204126.70679: _execute() done 34589 1727204126.70682: dumping result to json 34589 1727204126.70688: done dumping result, returning 34589 1727204126.70697: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-a9c6-cddc-00000000007a] 34589 1727204126.70702: sending task result for task 028d2410-947f-a9c6-cddc-00000000007a ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 34589 1727204126.70874: no more pending results, returning what we have 34589 1727204126.70880: results queue empty 34589 1727204126.70881: checking for any_errors_fatal 34589 1727204126.70893: done checking for any_errors_fatal 34589 1727204126.70894: checking for max_fail_percentage 34589 1727204126.70896: done checking for max_fail_percentage 34589 1727204126.70897: checking to see if all hosts have failed and the running result is not ok 34589 1727204126.70897: done checking to see if all hosts have failed 34589 1727204126.70898: getting the remaining hosts for this loop 34589 1727204126.70899: done getting the remaining hosts for this loop 34589 1727204126.70903: getting the next task for host managed-node1 34589 1727204126.70908: done getting next task for host managed-node1 34589 1727204126.70913: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204126.70915: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204126.70924: getting variables 34589 1727204126.70925: in VariableManager get_vars() 34589 1727204126.70961: Calling all_inventory to load vars for managed-node1 34589 1727204126.70964: Calling groups_inventory to load vars for managed-node1 34589 1727204126.70966: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204126.71108: Calling all_plugins_play to load vars for managed-node1 34589 1727204126.71113: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204126.71117: Calling groups_plugins_play to load vars for managed-node1 34589 1727204126.71638: done sending task result for task 028d2410-947f-a9c6-cddc-00000000007a 34589 1727204126.71642: WORKER PROCESS EXITING 34589 1727204126.74146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204126.75922: done with get_vars() 34589 1727204126.75951: done getting variables 34589 1727204126.76019: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.084) 0:00:26.895 ***** 34589 1727204126.76056: entering _queue_task() for managed-node1/debug 34589 1727204126.76753: worker is 1 (out of 1 available) 34589 1727204126.76766: exiting _queue_task() for managed-node1/debug 34589 1727204126.76781: done queuing things up, now waiting for results queue to drain 34589 1727204126.76782: waiting for pending results... 
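The task queued here, "Show debug messages for the network_state" (main.yml:186), is skipped in the entries that follow because network_state comes from the role defaults and evaluates to an empty dict. A minimal sketch of such a task, assuming the variable it would print is network_state itself; the guard condition network_state != {} is taken directly from the false_condition reported below:

    # Sketch only: task name and guard taken from the log, debugged variable assumed.
    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}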
34589 1727204126.77285: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204126.77519: in run() - task 028d2410-947f-a9c6-cddc-00000000007b 34589 1727204126.77769: variable 'ansible_search_path' from source: unknown 34589 1727204126.77781: variable 'ansible_search_path' from source: unknown 34589 1727204126.77825: calling self._execute() 34589 1727204126.78049: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.78064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.78382: variable 'omit' from source: magic vars 34589 1727204126.79287: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.79305: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.79423: variable 'connection_failed' from source: set_fact 34589 1727204126.79460: Evaluated conditional (not connection_failed): True 34589 1727204126.79584: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.79595: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.79702: variable 'connection_failed' from source: set_fact 34589 1727204126.79715: Evaluated conditional (not connection_failed): True 34589 1727204126.79832: variable 'network_state' from source: role '' defaults 34589 1727204126.79849: Evaluated conditional (network_state != {}): False 34589 1727204126.79856: when evaluation is False, skipping this task 34589 1727204126.79863: _execute() done 34589 1727204126.79870: dumping result to json 34589 1727204126.79880: done dumping result, returning 34589 1727204126.79895: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-a9c6-cddc-00000000007b] 34589 1727204126.79909: sending task result for task 028d2410-947f-a9c6-cddc-00000000007b skipping: [managed-node1] => { "false_condition": "network_state != {}" } 34589 1727204126.80063: no more pending results, returning what we have 34589 1727204126.80067: results queue empty 34589 1727204126.80068: checking for any_errors_fatal 34589 1727204126.80080: done checking for any_errors_fatal 34589 1727204126.80081: checking for max_fail_percentage 34589 1727204126.80083: done checking for max_fail_percentage 34589 1727204126.80084: checking to see if all hosts have failed and the running result is not ok 34589 1727204126.80085: done checking to see if all hosts have failed 34589 1727204126.80086: getting the remaining hosts for this loop 34589 1727204126.80087: done getting the remaining hosts for this loop 34589 1727204126.80092: getting the next task for host managed-node1 34589 1727204126.80098: done getting next task for host managed-node1 34589 1727204126.80101: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204126.80104: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204126.80123: done sending task result for task 028d2410-947f-a9c6-cddc-00000000007b 34589 1727204126.80127: WORKER PROCESS EXITING 34589 1727204126.80234: getting variables 34589 1727204126.80236: in VariableManager get_vars() 34589 1727204126.80283: Calling all_inventory to load vars for managed-node1 34589 1727204126.80286: Calling groups_inventory to load vars for managed-node1 34589 1727204126.80288: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204126.80301: Calling all_plugins_play to load vars for managed-node1 34589 1727204126.80305: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204126.80308: Calling groups_plugins_play to load vars for managed-node1 34589 1727204126.90334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204126.92408: done with get_vars() 34589 1727204126.92439: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.164) 0:00:27.060 ***** 34589 1727204126.92526: entering _queue_task() for managed-node1/ping 34589 1727204126.93049: worker is 1 (out of 1 available) 34589 1727204126.93062: exiting _queue_task() for managed-node1/ping 34589 1727204126.93074: done queuing things up, now waiting for results queue to drain 34589 1727204126.93078: waiting for pending results... 34589 1727204126.93366: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204126.93481: in run() - task 028d2410-947f-a9c6-cddc-00000000007c 34589 1727204126.93495: variable 'ansible_search_path' from source: unknown 34589 1727204126.93499: variable 'ansible_search_path' from source: unknown 34589 1727204126.93580: calling self._execute() 34589 1727204126.93755: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.93877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.93981: variable 'omit' from source: magic vars 34589 1727204126.94762: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.94766: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.94885: variable 'connection_failed' from source: set_fact 34589 1727204126.94889: Evaluated conditional (not connection_failed): True 34589 1727204126.95009: variable 'ansible_distribution_major_version' from source: facts 34589 1727204126.95012: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204126.95111: variable 'connection_failed' from source: set_fact 34589 1727204126.95115: Evaluated conditional (not connection_failed): True 34589 1727204126.95121: variable 'omit' from source: magic vars 34589 1727204126.95178: variable 'omit' from source: magic vars 34589 1727204126.95202: variable 'omit' from source: magic vars 34589 1727204126.95249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204126.95283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204126.95297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204126.95340: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.95343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204126.95359: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204126.95363: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.95366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.95506: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204126.95512: Set connection var ansible_shell_executable to /bin/sh 34589 1727204126.95516: Set connection var ansible_timeout to 10 34589 1727204126.95519: Set connection var ansible_shell_type to sh 34589 1727204126.95522: Set connection var ansible_connection to ssh 34589 1727204126.95525: Set connection var ansible_pipelining to False 34589 1727204126.95528: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.95531: variable 'ansible_connection' from source: unknown 34589 1727204126.95534: variable 'ansible_module_compression' from source: unknown 34589 1727204126.95569: variable 'ansible_shell_type' from source: unknown 34589 1727204126.95573: variable 'ansible_shell_executable' from source: unknown 34589 1727204126.95578: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204126.95580: variable 'ansible_pipelining' from source: unknown 34589 1727204126.95583: variable 'ansible_timeout' from source: unknown 34589 1727204126.95585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204126.95782: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204126.95787: variable 'omit' from source: magic vars 34589 1727204126.95789: starting attempt loop 34589 1727204126.95792: running the handler 34589 1727204126.95845: _low_level_execute_command(): starting 34589 1727204126.95847: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204126.96596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.96616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.96668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204126.96672: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.96891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204126.98586: stdout chunk (state=3): >>>/root <<< 34589 1727204126.98686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204126.98726: stderr chunk (state=3): >>><<< 34589 1727204126.98728: stdout chunk (state=3): >>><<< 34589 1727204126.98748: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204126.98781: _low_level_execute_command(): starting 34589 1727204126.98785: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514 `" && echo ansible-tmp-1727204126.987536-37269-131979459510514="` echo /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514 `" ) && sleep 0' 34589 1727204126.99166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204126.99180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204126.99210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.99214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204126.99216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204126.99218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204126.99265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204126.99268: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204126.99355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.01469: stdout chunk (state=3): >>>ansible-tmp-1727204126.987536-37269-131979459510514=/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514 <<< 34589 1727204127.01587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.01611: stderr chunk (state=3): >>><<< 34589 1727204127.01613: stdout chunk (state=3): >>><<< 34589 1727204127.01682: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204126.987536-37269-131979459510514=/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.01685: variable 'ansible_module_compression' from source: unknown 34589 1727204127.01694: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 34589 1727204127.01726: variable 'ansible_facts' from source: unknown 34589 1727204127.01779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py 34589 1727204127.01873: Sending initial data 34589 1727204127.01879: Sent initial data (152 bytes) 34589 1727204127.02293: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.02297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204127.02301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204127.02303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.02356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.02363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.02441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.04171: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34589 1727204127.04182: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204127.04241: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204127.04316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp92yvo_4x /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py <<< 34589 1727204127.04319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py" <<< 34589 1727204127.04390: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp92yvo_4x" to remote "/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py" <<< 34589 1727204127.04394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py" <<< 34589 1727204127.05051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.05087: stderr chunk (state=3): >>><<< 34589 1727204127.05090: stdout chunk (state=3): >>><<< 34589 1727204127.05116: done transferring module to remote 34589 1727204127.05125: _low_level_execute_command(): starting 34589 1727204127.05134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/ /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py && sleep 0' 34589 1727204127.05543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.05578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204127.05582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204127.05589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.05591: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204127.05593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.05644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.05661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.05761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.07745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.07763: stderr chunk (state=3): >>><<< 34589 1727204127.07766: stdout chunk (state=3): >>><<< 34589 1727204127.07784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.07787: _low_level_execute_command(): starting 34589 1727204127.07791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/AnsiballZ_ping.py && sleep 0' 34589 1727204127.08236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.08239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204127.08243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.08296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204127.08303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.08306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.08387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.24689: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34589 1727204127.26283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204127.26287: stdout chunk (state=3): >>><<< 34589 1727204127.26289: stderr chunk (state=3): >>><<< 34589 1727204127.26291: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
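The exchange above is the standard remote execution sequence Ansible uses for a module: create a temporary directory under ~/.ansible/tmp on the target, sftp the packaged module (AnsiballZ_ping.py) into it, chmod it, run it with the remote Python interpreter, read the JSON result from stdout, and finally remove the temporary directory. The task driving it, "Re-test connectivity" at roles/network/tasks/main.yml:192, is consistent with a bare ping task; a minimal sketch (assumed wording):

    # Sketch only: a bare ping task matching the module invocation seen in the log.
    - name: Re-test connectivity
      ping:

With no arguments the ping module returns {"ping": "pong"}, which matches the module stdout captured above.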
34589 1727204127.26295: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204127.26297: _low_level_execute_command(): starting 34589 1727204127.26303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204126.987536-37269-131979459510514/ > /dev/null 2>&1 && sleep 0' 34589 1727204127.27110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204127.27114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204127.27116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.27118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204127.27121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204127.27130: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204127.27139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.27291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204127.27295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.27305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.27494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.29579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.29583: stdout chunk (state=3): >>><<< 34589 1727204127.29590: stderr chunk (state=3): >>><<< 34589 1727204127.29610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.29613: handler run complete 34589 1727204127.29630: attempt loop complete, returning result 34589 1727204127.29633: _execute() done 34589 1727204127.29635: dumping result to json 34589 1727204127.29637: done dumping result, returning 34589 1727204127.29648: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-a9c6-cddc-00000000007c] 34589 1727204127.29651: sending task result for task 028d2410-947f-a9c6-cddc-00000000007c 34589 1727204127.29763: done sending task result for task 028d2410-947f-a9c6-cddc-00000000007c 34589 1727204127.29766: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 34589 1727204127.29845: no more pending results, returning what we have 34589 1727204127.29848: results queue empty 34589 1727204127.29849: checking for any_errors_fatal 34589 1727204127.29860: done checking for any_errors_fatal 34589 1727204127.29861: checking for max_fail_percentage 34589 1727204127.29863: done checking for max_fail_percentage 34589 1727204127.29864: checking to see if all hosts have failed and the running result is not ok 34589 1727204127.29865: done checking to see if all hosts have failed 34589 1727204127.29866: getting the remaining hosts for this loop 34589 1727204127.29867: done getting the remaining hosts for this loop 34589 1727204127.29871: getting the next task for host managed-node1 34589 1727204127.29883: done getting next task for host managed-node1 34589 1727204127.29887: ^ task is: TASK: meta (role_complete) 34589 1727204127.29889: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204127.29901: getting variables 34589 1727204127.29903: in VariableManager get_vars() 34589 1727204127.29944: Calling all_inventory to load vars for managed-node1 34589 1727204127.29947: Calling groups_inventory to load vars for managed-node1 34589 1727204127.29950: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.29961: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.29965: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.29968: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.31572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.33239: done with get_vars() 34589 1727204127.33266: done getting variables 34589 1727204127.33361: done queuing things up, now waiting for results queue to drain 34589 1727204127.33364: results queue empty 34589 1727204127.33365: checking for any_errors_fatal 34589 1727204127.33368: done checking for any_errors_fatal 34589 1727204127.33369: checking for max_fail_percentage 34589 1727204127.33370: done checking for max_fail_percentage 34589 1727204127.33371: checking to see if all hosts have failed and the running result is not ok 34589 1727204127.33372: done checking to see if all hosts have failed 34589 1727204127.33373: getting the remaining hosts for this loop 34589 1727204127.33374: done getting the remaining hosts for this loop 34589 1727204127.33378: getting the next task for host managed-node1 34589 1727204127.33383: done getting next task for host managed-node1 34589 1727204127.33385: ^ task is: TASK: meta (flush_handlers) 34589 1727204127.33386: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204127.33389: getting variables 34589 1727204127.33390: in VariableManager get_vars() 34589 1727204127.33405: Calling all_inventory to load vars for managed-node1 34589 1727204127.33407: Calling groups_inventory to load vars for managed-node1 34589 1727204127.33409: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.33414: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.33417: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.33419: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.34747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.36336: done with get_vars() 34589 1727204127.36362: done getting variables 34589 1727204127.36413: in VariableManager get_vars() 34589 1727204127.36425: Calling all_inventory to load vars for managed-node1 34589 1727204127.36427: Calling groups_inventory to load vars for managed-node1 34589 1727204127.36428: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.36433: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.36434: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.36436: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.37606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.39232: done with get_vars() 34589 1727204127.39264: done queuing things up, now waiting for results queue to drain 34589 1727204127.39266: results queue empty 34589 1727204127.39267: checking for any_errors_fatal 34589 1727204127.39269: done checking for any_errors_fatal 34589 1727204127.39269: checking for max_fail_percentage 34589 1727204127.39270: done checking for max_fail_percentage 34589 1727204127.39271: checking to see if all hosts have failed and the running result is not ok 34589 1727204127.39272: done checking to see if all hosts have failed 34589 1727204127.39273: getting the remaining hosts for this loop 34589 1727204127.39274: done getting the remaining hosts for this loop 34589 1727204127.39279: getting the next task for host managed-node1 34589 1727204127.39282: done getting next task for host managed-node1 34589 1727204127.39284: ^ task is: TASK: meta (flush_handlers) 34589 1727204127.39285: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
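The "meta (flush_handlers)" tasks being processed here are the implicit handler-flush steps Ansible appends after the task list of each play; they run any handlers notified earlier in the play. An explicit equivalent written into a playbook would be the following sketch:

    # Sketch only: explicit form of the implicit handler flush seen in the log.
    - name: Flush notified handlers
      meta: flush_handlers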
False 34589 1727204127.39319: getting variables 34589 1727204127.39321: in VariableManager get_vars() 34589 1727204127.39335: Calling all_inventory to load vars for managed-node1 34589 1727204127.39337: Calling groups_inventory to load vars for managed-node1 34589 1727204127.39339: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.39345: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.39347: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.39350: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.40591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.42630: done with get_vars() 34589 1727204127.42657: done getting variables 34589 1727204127.42823: in VariableManager get_vars() 34589 1727204127.42837: Calling all_inventory to load vars for managed-node1 34589 1727204127.42839: Calling groups_inventory to load vars for managed-node1 34589 1727204127.42841: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.42846: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.42848: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.42851: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.44410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.47690: done with get_vars() 34589 1727204127.47726: done queuing things up, now waiting for results queue to drain 34589 1727204127.47729: results queue empty 34589 1727204127.47730: checking for any_errors_fatal 34589 1727204127.47731: done checking for any_errors_fatal 34589 1727204127.47732: checking for max_fail_percentage 34589 1727204127.47733: done checking for max_fail_percentage 34589 1727204127.47734: checking to see if all hosts have failed and the running result is not ok 34589 1727204127.47735: done checking to see if all hosts have failed 34589 1727204127.47735: getting the remaining hosts for this loop 34589 1727204127.47736: done getting the remaining hosts for this loop 34589 1727204127.47740: getting the next task for host managed-node1 34589 1727204127.47744: done getting next task for host managed-node1 34589 1727204127.47745: ^ task is: None 34589 1727204127.47746: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204127.47747: done queuing things up, now waiting for results queue to drain 34589 1727204127.47748: results queue empty 34589 1727204127.47749: checking for any_errors_fatal 34589 1727204127.47750: done checking for any_errors_fatal 34589 1727204127.47750: checking for max_fail_percentage 34589 1727204127.47751: done checking for max_fail_percentage 34589 1727204127.47752: checking to see if all hosts have failed and the running result is not ok 34589 1727204127.47752: done checking to see if all hosts have failed 34589 1727204127.47754: getting the next task for host managed-node1 34589 1727204127.47756: done getting next task for host managed-node1 34589 1727204127.47757: ^ task is: None 34589 1727204127.47758: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204127.47926: in VariableManager get_vars() 34589 1727204127.47951: done with get_vars() 34589 1727204127.47957: in VariableManager get_vars() 34589 1727204127.47980: done with get_vars() 34589 1727204127.47986: variable 'omit' from source: magic vars 34589 1727204127.48108: variable 'profile' from source: play vars 34589 1727204127.48206: in VariableManager get_vars() 34589 1727204127.48220: done with get_vars() 34589 1727204127.48241: variable 'omit' from source: magic vars 34589 1727204127.48308: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 34589 1727204127.48998: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34589 1727204127.49021: getting the remaining hosts for this loop 34589 1727204127.49023: done getting the remaining hosts for this loop 34589 1727204127.49025: getting the next task for host managed-node1 34589 1727204127.49028: done getting next task for host managed-node1 34589 1727204127.49030: ^ task is: TASK: Gathering Facts 34589 1727204127.49031: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204127.49033: getting variables 34589 1727204127.49034: in VariableManager get_vars() 34589 1727204127.49044: Calling all_inventory to load vars for managed-node1 34589 1727204127.49047: Calling groups_inventory to load vars for managed-node1 34589 1727204127.49048: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204127.49054: Calling all_plugins_play to load vars for managed-node1 34589 1727204127.49056: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204127.49059: Calling groups_plugins_play to load vars for managed-node1 34589 1727204127.50261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204127.51819: done with get_vars() 34589 1727204127.51844: done getting variables 34589 1727204127.51894: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.593) 0:00:27.654 ***** 34589 1727204127.51919: entering _queue_task() for managed-node1/gather_facts 34589 1727204127.52255: worker is 1 (out of 1 available) 34589 1727204127.52266: exiting _queue_task() for managed-node1/gather_facts 34589 1727204127.52280: done queuing things up, now waiting for results queue to drain 34589 1727204127.52282: waiting for pending results... 
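The new play header "PLAY [Remove {{ profile }}]" and the Gathering Facts task queued from tests/network/playbooks/remove_profile.yml:3 show the run moving on to a second play that re-gathers facts before removing the test profile. A rough sketch of how that play could be laid out follows; apart from the templated play name, the fact gathering, and the profile play variable, everything here is an assumption (the value ethtest0 is inferred from the connection name handled earlier in the log, and the task list is a placeholder since the playbook body is not visible in this output):

    # Sketch only: reconstructed play shape, not the verbatim remove_profile.yml.
    - name: Remove {{ profile }}
      hosts: managed-node1          # assumed target; the log runs this play against managed-node1
      gather_facts: true            # produces the "Gathering Facts" task at remove_profile.yml:3
      vars:
        profile: ethtest0           # assumed value, inferred from the connection name seen earlier
      tasks:
        - name: Remove the profile  # placeholder; the real tasks are not shown in this log excerpt
          debug:
            msg: "removing {{ profile }}"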
34589 1727204127.52597: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34589 1727204127.52695: in run() - task 028d2410-947f-a9c6-cddc-000000000521 34589 1727204127.52718: variable 'ansible_search_path' from source: unknown 34589 1727204127.52762: calling self._execute() 34589 1727204127.52880: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204127.52893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204127.52909: variable 'omit' from source: magic vars 34589 1727204127.53313: variable 'ansible_distribution_major_version' from source: facts 34589 1727204127.53332: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204127.53343: variable 'omit' from source: magic vars 34589 1727204127.53382: variable 'omit' from source: magic vars 34589 1727204127.53424: variable 'omit' from source: magic vars 34589 1727204127.53472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204127.53527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204127.53553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204127.53578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204127.53595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204127.53632: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204127.53641: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204127.53649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204127.53758: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204127.53770: Set connection var ansible_shell_executable to /bin/sh 34589 1727204127.53788: Set connection var ansible_timeout to 10 34589 1727204127.53795: Set connection var ansible_shell_type to sh 34589 1727204127.53806: Set connection var ansible_connection to ssh 34589 1727204127.53815: Set connection var ansible_pipelining to False 34589 1727204127.53847: variable 'ansible_shell_executable' from source: unknown 34589 1727204127.53855: variable 'ansible_connection' from source: unknown 34589 1727204127.53863: variable 'ansible_module_compression' from source: unknown 34589 1727204127.53870: variable 'ansible_shell_type' from source: unknown 34589 1727204127.53880: variable 'ansible_shell_executable' from source: unknown 34589 1727204127.53888: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204127.53896: variable 'ansible_pipelining' from source: unknown 34589 1727204127.53904: variable 'ansible_timeout' from source: unknown 34589 1727204127.53952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204127.54108: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204127.54125: variable 'omit' from source: magic vars 34589 1727204127.54135: starting attempt loop 34589 1727204127.54141: running the 
handler 34589 1727204127.54167: variable 'ansible_facts' from source: unknown 34589 1727204127.54198: _low_level_execute_command(): starting 34589 1727204127.54282: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204127.54997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.55059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204127.55095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.55212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.57018: stdout chunk (state=3): >>>/root <<< 34589 1727204127.57164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.57178: stdout chunk (state=3): >>><<< 34589 1727204127.57483: stderr chunk (state=3): >>><<< 34589 1727204127.57487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.57490: _low_level_execute_command(): starting 34589 1727204127.57492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810 `" && echo ansible-tmp-1727204127.5725381-37300-18495545954810="` echo /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810 `" ) && sleep 0' 34589 
1727204127.58614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.58630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204127.58642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.58852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204127.58906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.58953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.59105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.61265: stdout chunk (state=3): >>>ansible-tmp-1727204127.5725381-37300-18495545954810=/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810 <<< 34589 1727204127.61423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.61440: stdout chunk (state=3): >>><<< 34589 1727204127.61459: stderr chunk (state=3): >>><<< 34589 1727204127.61652: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204127.5725381-37300-18495545954810=/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.61656: variable 'ansible_module_compression' from source: unknown 34589 1727204127.61781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204127.61841: variable 'ansible_facts' from 
source: unknown 34589 1727204127.62314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py 34589 1727204127.62733: Sending initial data 34589 1727204127.62743: Sent initial data (153 bytes) 34589 1727204127.63983: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204127.64005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.64063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204127.64157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.64274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.64376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.66153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204127.66280: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204127.66356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmps4oo88fr /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py <<< 34589 1727204127.66366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py" <<< 34589 1727204127.66453: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmps4oo88fr" to remote "/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py" <<< 34589 1727204127.69087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.69266: stderr chunk (state=3): >>><<< 34589 1727204127.69270: stdout chunk (state=3): >>><<< 34589 1727204127.69274: done transferring module to remote 34589 1727204127.69279: _low_level_execute_command(): starting 34589 1727204127.69281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/ /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py && sleep 0' 34589 1727204127.70713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.70717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.70736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.70847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204127.72864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204127.72886: stdout chunk (state=3): >>><<< 34589 1727204127.72910: stderr chunk (state=3): >>><<< 34589 1727204127.72958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204127.73008: _low_level_execute_command(): starting 34589 1727204127.73013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/AnsiballZ_setup.py && sleep 0' 34589 1727204127.73761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204127.74140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204127.74145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204127.74190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204127.74260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204127.74326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204127.74593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.43971: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.677734375, "5m": 0.5380859375, "15m": 0.28662109375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off 
[fixed]", "rx_vlan_sta<<< 34589 1727204128.44004: stdout chunk (state=3): >>>g_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2919, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 612, "free": 2919}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis<<< 34589 1727204128.44025: stdout chunk (state=3): >>>_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 719, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785325568, "block_size": 4096, "block_total": 65519099, "block_available": 63912433, "block_used": 1606666, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "28", "epoch": "1727204128", "epoch_int": "1727204128", "date": "2024-09-24", "time": "14:55:28", "iso8601_micro": "2024-09-24T18:55:28.434962Z", "iso8601": "2024-09-24T18:55:28Z", "iso8601_basic": "20240924T145528434962", "iso8601_basic_short": "20240924T145528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204128.46521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204128.46525: stdout chunk (state=3): >>><<< 34589 1727204128.46527: stderr chunk (state=3): >>><<< 34589 1727204128.46586: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.677734375, "5m": 0.5380859375, "15m": 0.28662109375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["ethtest0", "lo", "eth0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": 
[{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2919, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 612, "free": 2919}, "nocache": {"free": 3277, "used": 254}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": 
"524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 719, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785325568, "block_size": 4096, "block_total": 65519099, "block_available": 63912433, "block_used": 1606666, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "28", "epoch": "1727204128", "epoch_int": "1727204128", "date": "2024-09-24", "time": "14:55:28", "iso8601_micro": "2024-09-24T18:55:28.434962Z", "iso8601": "2024-09-24T18:55:28Z", "iso8601_basic": "20240924T145528434962", "iso8601_basic_short": "20240924T145528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204128.47067: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204128.47179: _low_level_execute_command(): starting 34589 1727204128.47183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204127.5725381-37300-18495545954810/ > /dev/null 2>&1 && sleep 0' 34589 1727204128.47738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204128.47752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.47766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204128.47787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204128.47809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204128.47822: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204128.47835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.47931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204128.47953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.48064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.50147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204128.50158: stdout chunk (state=3): >>><<< 34589 1727204128.50170: stderr chunk (state=3): >>><<< 34589 1727204128.50192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204128.50209: handler run complete 34589 1727204128.50367: variable 'ansible_facts' from source: unknown 34589 1727204128.50484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.50849: variable 'ansible_facts' from source: unknown 34589 1727204128.50951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.51125: attempt loop complete, returning result 34589 1727204128.51135: _execute() done 34589 1727204128.51143: dumping result to json 34589 1727204128.51183: done dumping result, returning 34589 1727204128.51194: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-a9c6-cddc-000000000521] 34589 1727204128.51200: sending task result for task 028d2410-947f-a9c6-cddc-000000000521 ok: [managed-node1] 34589 1727204128.52454: no more pending results, returning what we have 34589 1727204128.52457: results queue empty 34589 1727204128.52458: checking for any_errors_fatal 34589 1727204128.52459: done checking for any_errors_fatal 34589 1727204128.52460: checking for max_fail_percentage 34589 1727204128.52462: done checking for max_fail_percentage 34589 1727204128.52462: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.52463: done checking to see if all hosts have failed 34589 1727204128.52464: getting the remaining hosts for this loop 34589 1727204128.52465: done getting the remaining hosts for this loop 34589 1727204128.52468: getting the next task for host managed-node1 34589 1727204128.52473: done getting next task for host managed-node1 34589 1727204128.52475: ^ task is: TASK: meta (flush_handlers) 34589 1727204128.52481: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204128.52486: getting variables 34589 1727204128.52487: in VariableManager get_vars() 34589 1727204128.52518: Calling all_inventory to load vars for managed-node1 34589 1727204128.52522: Calling groups_inventory to load vars for managed-node1 34589 1727204128.52524: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.52531: done sending task result for task 028d2410-947f-a9c6-cddc-000000000521 34589 1727204128.52534: WORKER PROCESS EXITING 34589 1727204128.52543: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.52546: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.52549: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.53915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.55528: done with get_vars() 34589 1727204128.55555: done getting variables 34589 1727204128.55631: in VariableManager get_vars() 34589 1727204128.55645: Calling all_inventory to load vars for managed-node1 34589 1727204128.55648: Calling groups_inventory to load vars for managed-node1 34589 1727204128.55650: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.55655: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.55658: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.55661: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.56824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.58484: done with get_vars() 34589 1727204128.58511: done queuing things up, now waiting for results queue to drain 34589 1727204128.58513: results queue empty 34589 1727204128.58514: checking for any_errors_fatal 34589 1727204128.58518: done checking for any_errors_fatal 34589 1727204128.58523: checking for max_fail_percentage 34589 1727204128.58524: done checking for max_fail_percentage 34589 1727204128.58525: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.58526: done checking to see if all hosts have failed 34589 1727204128.58526: getting the remaining hosts for this loop 34589 1727204128.58527: done getting the remaining hosts for this loop 34589 1727204128.58530: getting the next task for host managed-node1 34589 1727204128.58534: done getting next task for host managed-node1 34589 1727204128.58536: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204128.58538: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204128.58547: getting variables 34589 1727204128.58548: in VariableManager get_vars() 34589 1727204128.58562: Calling all_inventory to load vars for managed-node1 34589 1727204128.58564: Calling groups_inventory to load vars for managed-node1 34589 1727204128.58566: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.58571: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.58573: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.58577: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.59748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.61330: done with get_vars() 34589 1727204128.61348: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:28 -0400 (0:00:01.095) 0:00:28.749 ***** 34589 1727204128.61426: entering _queue_task() for managed-node1/include_tasks 34589 1727204128.61805: worker is 1 (out of 1 available) 34589 1727204128.61820: exiting _queue_task() for managed-node1/include_tasks 34589 1727204128.61832: done queuing things up, now waiting for results queue to drain 34589 1727204128.61833: waiting for pending results... 34589 1727204128.62143: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34589 1727204128.62271: in run() - task 028d2410-947f-a9c6-cddc-000000000084 34589 1727204128.62302: variable 'ansible_search_path' from source: unknown 34589 1727204128.62314: variable 'ansible_search_path' from source: unknown 34589 1727204128.62356: calling self._execute() 34589 1727204128.62467: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.62483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.62499: variable 'omit' from source: magic vars 34589 1727204128.62891: variable 'ansible_distribution_major_version' from source: facts 34589 1727204128.62910: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204128.62922: _execute() done 34589 1727204128.62931: dumping result to json 34589 1727204128.62939: done dumping result, returning 34589 1727204128.63060: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-a9c6-cddc-000000000084] 34589 1727204128.63063: sending task result for task 028d2410-947f-a9c6-cddc-000000000084 34589 1727204128.63140: done sending task result for task 028d2410-947f-a9c6-cddc-000000000084 34589 1727204128.63143: WORKER PROCESS EXITING 34589 1727204128.63186: no more pending results, returning what we have 34589 1727204128.63191: in VariableManager get_vars() 34589 1727204128.63235: Calling all_inventory to load vars for managed-node1 34589 1727204128.63238: Calling groups_inventory to load vars for managed-node1 34589 1727204128.63241: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.63253: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.63256: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.63258: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.64921: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.66561: done with get_vars() 34589 1727204128.66582: variable 'ansible_search_path' from source: unknown 34589 1727204128.66584: variable 'ansible_search_path' from source: unknown 34589 1727204128.66614: we have included files to process 34589 1727204128.66615: generating all_blocks data 34589 1727204128.66617: done generating all_blocks data 34589 1727204128.66617: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204128.66618: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204128.66621: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 34589 1727204128.67181: done processing included file 34589 1727204128.67184: iterating over new_blocks loaded from include file 34589 1727204128.67185: in VariableManager get_vars() 34589 1727204128.67210: done with get_vars() 34589 1727204128.67211: filtering new block on tags 34589 1727204128.67230: done filtering new block on tags 34589 1727204128.67232: in VariableManager get_vars() 34589 1727204128.67252: done with get_vars() 34589 1727204128.67254: filtering new block on tags 34589 1727204128.67272: done filtering new block on tags 34589 1727204128.67275: in VariableManager get_vars() 34589 1727204128.67296: done with get_vars() 34589 1727204128.67298: filtering new block on tags 34589 1727204128.67316: done filtering new block on tags 34589 1727204128.67318: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 34589 1727204128.67324: extending task lists for all hosts with included blocks 34589 1727204128.67710: done extending task lists 34589 1727204128.67711: done processing included files 34589 1727204128.67712: results queue empty 34589 1727204128.67713: checking for any_errors_fatal 34589 1727204128.67715: done checking for any_errors_fatal 34589 1727204128.67715: checking for max_fail_percentage 34589 1727204128.67716: done checking for max_fail_percentage 34589 1727204128.67717: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.67718: done checking to see if all hosts have failed 34589 1727204128.67719: getting the remaining hosts for this loop 34589 1727204128.67720: done getting the remaining hosts for this loop 34589 1727204128.67722: getting the next task for host managed-node1 34589 1727204128.67726: done getting next task for host managed-node1 34589 1727204128.67729: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204128.67731: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204128.67740: getting variables 34589 1727204128.67741: in VariableManager get_vars() 34589 1727204128.67754: Calling all_inventory to load vars for managed-node1 34589 1727204128.67756: Calling groups_inventory to load vars for managed-node1 34589 1727204128.67758: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.67763: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.67766: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.67769: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.68949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.70237: done with get_vars() 34589 1727204128.70254: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.088) 0:00:28.838 ***** 34589 1727204128.70313: entering _queue_task() for managed-node1/setup 34589 1727204128.70580: worker is 1 (out of 1 available) 34589 1727204128.70594: exiting _queue_task() for managed-node1/setup 34589 1727204128.70606: done queuing things up, now waiting for results queue to drain 34589 1727204128.70610: waiting for pending results... 34589 1727204128.70789: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 34589 1727204128.70877: in run() - task 028d2410-947f-a9c6-cddc-000000000562 34589 1727204128.70889: variable 'ansible_search_path' from source: unknown 34589 1727204128.70893: variable 'ansible_search_path' from source: unknown 34589 1727204128.70922: calling self._execute() 34589 1727204128.70998: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.71002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.71057: variable 'omit' from source: magic vars 34589 1727204128.71282: variable 'ansible_distribution_major_version' from source: facts 34589 1727204128.71292: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204128.71436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204128.73306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204128.73350: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204128.73378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204128.73409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204128.73426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204128.73485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204128.73513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 34589 1727204128.73528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204128.73554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204128.73565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204128.73603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204128.73624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204128.73642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204128.73667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204128.73679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204128.73790: variable '__network_required_facts' from source: role '' defaults 34589 1727204128.73798: variable 'ansible_facts' from source: unknown 34589 1727204128.74225: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 34589 1727204128.74229: when evaluation is False, skipping this task 34589 1727204128.74232: _execute() done 34589 1727204128.74234: dumping result to json 34589 1727204128.74237: done dumping result, returning 34589 1727204128.74243: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-a9c6-cddc-000000000562] 34589 1727204128.74248: sending task result for task 028d2410-947f-a9c6-cddc-000000000562 34589 1727204128.74333: done sending task result for task 028d2410-947f-a9c6-cddc-000000000562 34589 1727204128.74335: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204128.74414: no more pending results, returning what we have 34589 1727204128.74419: results queue empty 34589 1727204128.74420: checking for any_errors_fatal 34589 1727204128.74421: done checking for any_errors_fatal 34589 1727204128.74421: checking for max_fail_percentage 34589 1727204128.74423: done checking for max_fail_percentage 34589 1727204128.74424: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.74425: done checking to see if all hosts have failed 34589 1727204128.74425: getting the remaining hosts for this loop 34589 1727204128.74427: done getting the remaining hosts for 
this loop 34589 1727204128.74430: getting the next task for host managed-node1 34589 1727204128.74438: done getting next task for host managed-node1 34589 1727204128.74442: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204128.74446: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204128.74459: getting variables 34589 1727204128.74460: in VariableManager get_vars() 34589 1727204128.74496: Calling all_inventory to load vars for managed-node1 34589 1727204128.74498: Calling groups_inventory to load vars for managed-node1 34589 1727204128.74500: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.74511: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.74514: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.74516: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.75768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.76882: done with get_vars() 34589 1727204128.76899: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.066) 0:00:28.904 ***** 34589 1727204128.76972: entering _queue_task() for managed-node1/stat 34589 1727204128.77225: worker is 1 (out of 1 available) 34589 1727204128.77239: exiting _queue_task() for managed-node1/stat 34589 1727204128.77251: done queuing things up, now waiting for results queue to drain 34589 1727204128.77253: waiting for pending results... 
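The skip recorded just above comes from the gate at set_facts.yml:3: the role only re-runs fact gathering when a fact it needs is missing, and the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False because everything required is already present from the earlier Gathering Facts step. A minimal sketch of that gating pattern follows; the gather_subset value is an assumption for illustration, since the role's actual defaults are not shown in this trace.

    # Sketch of the gating pattern evaluated at set_facts.yml:3: only gather
    # facts again when something named in __network_required_facts is missing
    # from what is already known about the host. gather_subset here is an
    # illustrative assumption.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
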
34589 1727204128.77435: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 34589 1727204128.77533: in run() - task 028d2410-947f-a9c6-cddc-000000000564 34589 1727204128.77545: variable 'ansible_search_path' from source: unknown 34589 1727204128.77549: variable 'ansible_search_path' from source: unknown 34589 1727204128.77577: calling self._execute() 34589 1727204128.77652: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.77656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.77664: variable 'omit' from source: magic vars 34589 1727204128.77966: variable 'ansible_distribution_major_version' from source: facts 34589 1727204128.77988: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204128.78139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204128.78410: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204128.78490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204128.78511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204128.78550: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204128.78683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204128.78688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204128.78705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204128.78735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204128.78819: variable '__network_is_ostree' from source: set_fact 34589 1727204128.78825: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204128.78828: when evaluation is False, skipping this task 34589 1727204128.78830: _execute() done 34589 1727204128.78837: dumping result to json 34589 1727204128.78839: done dumping result, returning 34589 1727204128.78842: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-a9c6-cddc-000000000564] 34589 1727204128.78844: sending task result for task 028d2410-947f-a9c6-cddc-000000000564 34589 1727204128.78939: done sending task result for task 028d2410-947f-a9c6-cddc-000000000564 34589 1727204128.78941: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204128.79012: no more pending results, returning what we have 34589 1727204128.79016: results queue empty 34589 1727204128.79016: checking for any_errors_fatal 34589 1727204128.79022: done checking for any_errors_fatal 34589 1727204128.79023: checking for 
max_fail_percentage 34589 1727204128.79025: done checking for max_fail_percentage 34589 1727204128.79026: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.79027: done checking to see if all hosts have failed 34589 1727204128.79028: getting the remaining hosts for this loop 34589 1727204128.79029: done getting the remaining hosts for this loop 34589 1727204128.79032: getting the next task for host managed-node1 34589 1727204128.79040: done getting next task for host managed-node1 34589 1727204128.79044: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204128.79046: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204128.79059: getting variables 34589 1727204128.79060: in VariableManager get_vars() 34589 1727204128.79094: Calling all_inventory to load vars for managed-node1 34589 1727204128.79097: Calling groups_inventory to load vars for managed-node1 34589 1727204128.79099: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.79108: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.79110: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.79113: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.80358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.81335: done with get_vars() 34589 1727204128.81350: done getting variables 34589 1727204128.81394: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.044) 0:00:28.949 ***** 34589 1727204128.81419: entering _queue_task() for managed-node1/set_fact 34589 1727204128.81651: worker is 1 (out of 1 available) 34589 1727204128.81664: exiting _queue_task() for managed-node1/set_fact 34589 1727204128.81679: done queuing things up, now waiting for results queue to drain 34589 1727204128.81680: waiting for pending results... 
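The stat task at set_facts.yml:12 was just skipped because __network_is_ostree already exists as a cached fact from an earlier pass, so not __network_is_ostree is defined is False; the set_fact at set_facts.yml:17, queued next, is gated the same way and will be skipped for the same reason. A sketch of the check-and-cache pattern the pair implements is below; the /run/ostree-booted path and the register name are assumptions added for illustration and do not appear in this trace.

    # Illustrative check-and-cache pattern: probe once, record the answer as a
    # fact, and skip both tasks on later passes. The path and register name are
    # assumptions; only the task names, file positions, and the when-condition
    # come from the trace.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
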
34589 1727204128.81855: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 34589 1727204128.81939: in run() - task 028d2410-947f-a9c6-cddc-000000000565 34589 1727204128.81950: variable 'ansible_search_path' from source: unknown 34589 1727204128.81953: variable 'ansible_search_path' from source: unknown 34589 1727204128.81979: calling self._execute() 34589 1727204128.82059: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.82063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.82072: variable 'omit' from source: magic vars 34589 1727204128.82497: variable 'ansible_distribution_major_version' from source: facts 34589 1727204128.82501: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204128.82651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204128.82950: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204128.82999: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204128.83048: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204128.83139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204128.83176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204128.83225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204128.83238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204128.83263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204128.83328: variable '__network_is_ostree' from source: set_fact 34589 1727204128.83334: Evaluated conditional (not __network_is_ostree is defined): False 34589 1727204128.83337: when evaluation is False, skipping this task 34589 1727204128.83339: _execute() done 34589 1727204128.83342: dumping result to json 34589 1727204128.83347: done dumping result, returning 34589 1727204128.83354: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-a9c6-cddc-000000000565] 34589 1727204128.83365: sending task result for task 028d2410-947f-a9c6-cddc-000000000565 34589 1727204128.83444: done sending task result for task 028d2410-947f-a9c6-cddc-000000000565 34589 1727204128.83447: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 34589 1727204128.83519: no more pending results, returning what we have 34589 1727204128.83523: results queue empty 34589 1727204128.83523: checking for any_errors_fatal 34589 1727204128.83529: done checking for any_errors_fatal 34589 
1727204128.83530: checking for max_fail_percentage 34589 1727204128.83531: done checking for max_fail_percentage 34589 1727204128.83532: checking to see if all hosts have failed and the running result is not ok 34589 1727204128.83533: done checking to see if all hosts have failed 34589 1727204128.83534: getting the remaining hosts for this loop 34589 1727204128.83535: done getting the remaining hosts for this loop 34589 1727204128.83539: getting the next task for host managed-node1 34589 1727204128.83546: done getting next task for host managed-node1 34589 1727204128.83550: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204128.83552: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204128.83564: getting variables 34589 1727204128.83566: in VariableManager get_vars() 34589 1727204128.83599: Calling all_inventory to load vars for managed-node1 34589 1727204128.83602: Calling groups_inventory to load vars for managed-node1 34589 1727204128.83604: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204128.83615: Calling all_plugins_play to load vars for managed-node1 34589 1727204128.83617: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204128.83620: Calling groups_plugins_play to load vars for managed-node1 34589 1727204128.84405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204128.85286: done with get_vars() 34589 1727204128.85304: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.039) 0:00:28.988 ***** 34589 1727204128.85374: entering _queue_task() for managed-node1/service_facts 34589 1727204128.85636: worker is 1 (out of 1 available) 34589 1727204128.85650: exiting _queue_task() for managed-node1/service_facts 34589 1727204128.85662: done queuing things up, now waiting for results queue to drain 34589 1727204128.85664: waiting for pending results... 
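The next task, Check which services are running (set_facts.yml:21), is handled by the service_facts module: Ansible builds an AnsiballZ payload for it, copies it to a fresh remote temp directory over the existing SSH ControlMaster session, runs it with the remote /usr/bin/python3.12, and parses the JSON it prints, which is the large ansible_facts.services mapping that follows. The sketch below shows the module call and one illustrative way to consume its result; only the first task is traced here, while the debug task and the service key it reads are assumptions for illustration (NetworkManager.service does appear, with state "running", in the output further down).

    # The traced task takes no arguments; everything it learns ends up under
    # ansible_facts.services. The follow-up debug task is illustrative only.
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Show the state of NetworkManager (illustrative consumer)
      ansible.builtin.debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
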
34589 1727204128.85842: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 34589 1727204128.85924: in run() - task 028d2410-947f-a9c6-cddc-000000000567 34589 1727204128.85936: variable 'ansible_search_path' from source: unknown 34589 1727204128.85940: variable 'ansible_search_path' from source: unknown 34589 1727204128.85967: calling self._execute() 34589 1727204128.86044: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.86048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.86056: variable 'omit' from source: magic vars 34589 1727204128.86337: variable 'ansible_distribution_major_version' from source: facts 34589 1727204128.86343: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204128.86350: variable 'omit' from source: magic vars 34589 1727204128.86387: variable 'omit' from source: magic vars 34589 1727204128.86445: variable 'omit' from source: magic vars 34589 1727204128.86449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204128.86471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204128.86488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204128.86502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204128.86511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204128.86538: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204128.86541: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.86545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.86618: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204128.86621: Set connection var ansible_shell_executable to /bin/sh 34589 1727204128.86629: Set connection var ansible_timeout to 10 34589 1727204128.86632: Set connection var ansible_shell_type to sh 34589 1727204128.86637: Set connection var ansible_connection to ssh 34589 1727204128.86642: Set connection var ansible_pipelining to False 34589 1727204128.86665: variable 'ansible_shell_executable' from source: unknown 34589 1727204128.86669: variable 'ansible_connection' from source: unknown 34589 1727204128.86671: variable 'ansible_module_compression' from source: unknown 34589 1727204128.86674: variable 'ansible_shell_type' from source: unknown 34589 1727204128.86678: variable 'ansible_shell_executable' from source: unknown 34589 1727204128.86680: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204128.86682: variable 'ansible_pipelining' from source: unknown 34589 1727204128.86684: variable 'ansible_timeout' from source: unknown 34589 1727204128.86686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204128.86827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204128.86836: variable 'omit' from source: magic vars 34589 
1727204128.86841: starting attempt loop 34589 1727204128.86843: running the handler 34589 1727204128.86855: _low_level_execute_command(): starting 34589 1727204128.86863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204128.87381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.87385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.87388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.87390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.87446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204128.87449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204128.87451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.87540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.89332: stdout chunk (state=3): >>>/root <<< 34589 1727204128.89433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204128.89471: stderr chunk (state=3): >>><<< 34589 1727204128.89473: stdout chunk (state=3): >>><<< 34589 1727204128.89494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204128.89509: _low_level_execute_command(): starting 34589 1727204128.89518: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797 `" && echo ansible-tmp-1727204128.8949203-37344-132525012805797="` echo /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797 `" ) && sleep 0' 34589 1727204128.90324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.90328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204128.90330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204128.90341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204128.90345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.90393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.90474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.92647: stdout chunk (state=3): >>>ansible-tmp-1727204128.8949203-37344-132525012805797=/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797 <<< 34589 1727204128.92781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204128.92793: stderr chunk (state=3): >>><<< 34589 1727204128.92796: stdout chunk (state=3): >>><<< 34589 1727204128.92855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204128.8949203-37344-132525012805797=/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204128.92859: variable 
'ansible_module_compression' from source: unknown 34589 1727204128.92887: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 34589 1727204128.92916: variable 'ansible_facts' from source: unknown 34589 1727204128.92971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py 34589 1727204128.93073: Sending initial data 34589 1727204128.93078: Sent initial data (162 bytes) 34589 1727204128.93523: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204128.93526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204128.93529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.93531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.93533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.93581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204128.93590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.93679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.95481: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34589 1727204128.95484: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204128.95549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204128.95628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp37mhmmr4 /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py <<< 34589 1727204128.95631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py" <<< 34589 1727204128.95708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp37mhmmr4" to remote "/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py" <<< 34589 1727204128.95710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py" <<< 34589 1727204128.96399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204128.96446: stderr chunk (state=3): >>><<< 34589 1727204128.96451: stdout chunk (state=3): >>><<< 34589 1727204128.96502: done transferring module to remote 34589 1727204128.96514: _low_level_execute_command(): starting 34589 1727204128.96518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/ /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py && sleep 0' 34589 1727204128.96969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204128.96974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204128.96979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.96982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204128.96988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.97043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204128.97046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204128.97048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.97125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204128.99157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204128.99183: stderr chunk (state=3): >>><<< 34589 1727204128.99187: stdout chunk (state=3): >>><<< 34589 1727204128.99203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204128.99205: _low_level_execute_command(): starting 34589 1727204128.99211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/AnsiballZ_service_facts.py && sleep 0' 34589 1727204128.99719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.99723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204128.99774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204128.99862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204130.78265: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 34589 1727204130.78329: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": 
{"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 34589 1727204130.80184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204130.80188: stdout chunk (state=3): >>><<< 34589 1727204130.80382: stderr chunk (state=3): >>><<< 34589 1727204130.80388: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": 
{"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": 
"unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204130.81273: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204130.81301: _low_level_execute_command(): starting 34589 1727204130.81315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204128.8949203-37344-132525012805797/ > /dev/null 2>&1 && sleep 0' 34589 1727204130.81960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204130.81979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204130.81993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204130.82013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204130.82057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204130.82127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204130.82146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204130.82177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204130.82296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204130.84364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204130.84368: stdout chunk (state=3): >>><<< 34589 1727204130.84370: stderr chunk (state=3): >>><<< 34589 1727204130.84389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204130.84582: handler run complete 34589 1727204130.84614: variable 'ansible_facts' from source: unknown 34589 1727204130.84981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204130.85486: variable 'ansible_facts' from source: unknown 34589 1727204130.85632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204130.85854: attempt loop complete, returning result 34589 1727204130.85864: _execute() done 34589 1727204130.85871: dumping result to json 34589 1727204130.85946: done dumping result, returning 34589 1727204130.85960: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-a9c6-cddc-000000000567] 34589 1727204130.85968: sending task result for task 028d2410-947f-a9c6-cddc-000000000567 34589 1727204130.87406: done sending task result for task 028d2410-947f-a9c6-cddc-000000000567 34589 1727204130.87410: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204130.87522: no more pending results, returning what we have 34589 1727204130.87525: results queue empty 34589 1727204130.87525: checking for any_errors_fatal 34589 1727204130.87529: done checking for any_errors_fatal 34589 1727204130.87531: checking for max_fail_percentage 34589 1727204130.87533: done checking for max_fail_percentage 34589 1727204130.87534: checking to see if all hosts have failed and the running result is not ok 34589 1727204130.87535: done checking to see if all hosts have failed 34589 1727204130.87536: getting the remaining hosts for this loop 34589 1727204130.87537: done getting the remaining hosts for this loop 34589 1727204130.87540: getting the next task for host managed-node1 34589 1727204130.87546: done getting next task for host managed-node1 34589 1727204130.87548: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204130.87551: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204130.87560: getting variables 34589 1727204130.87561: in VariableManager get_vars() 34589 1727204130.87591: Calling all_inventory to load vars for managed-node1 34589 1727204130.87594: Calling groups_inventory to load vars for managed-node1 34589 1727204130.87596: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204130.87604: Calling all_plugins_play to load vars for managed-node1 34589 1727204130.87607: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204130.87610: Calling groups_plugins_play to load vars for managed-node1 34589 1727204130.88832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204130.90450: done with get_vars() 34589 1727204130.90485: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:30 -0400 (0:00:02.052) 0:00:31.040 ***** 34589 1727204130.90587: entering _queue_task() for managed-node1/package_facts 34589 1727204130.91100: worker is 1 (out of 1 available) 34589 1727204130.91112: exiting _queue_task() for managed-node1/package_facts 34589 1727204130.91123: done queuing things up, now waiting for results queue to drain 34589 1727204130.91125: waiting for pending results... 34589 1727204130.91367: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 34589 1727204130.91458: in run() - task 028d2410-947f-a9c6-cddc-000000000568 34589 1727204130.91581: variable 'ansible_search_path' from source: unknown 34589 1727204130.91585: variable 'ansible_search_path' from source: unknown 34589 1727204130.91589: calling self._execute() 34589 1727204130.91643: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204130.91654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204130.91668: variable 'omit' from source: magic vars 34589 1727204130.92056: variable 'ansible_distribution_major_version' from source: facts 34589 1727204130.92073: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204130.92086: variable 'omit' from source: magic vars 34589 1727204130.92157: variable 'omit' from source: magic vars 34589 1727204130.92197: variable 'omit' from source: magic vars 34589 1727204130.92281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204130.92290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204130.92314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204130.92342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204130.92357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204130.92390: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204130.92444: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204130.92447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204130.92516: Set connection var ansible_module_compression to ZIP_DEFLATED 
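The task being queued here runs the package_facts module from roles/network/tasks/set_facts.yml:26. Its result, which appears further down in this log, is an ansible_facts.packages mapping: one key per installed package name, each holding a list of {name, version, release, epoch, arch, source} records. As a rough illustration of that data shape only, the sketch below rebuilds a similar mapping with the rpm command line; it is not the module's actual implementation (the real module is the AnsiballZ_package_facts.py payload transferred below and selects its own package-manager backend), and the helper name rpm_package_facts is assumed for this example.

    import subprocess

    # Illustrative sketch only: approximate the ansible_facts.packages shape seen in this
    # log by querying the RPM database directly. Not the package_facts implementation.
    QUERY = "%{NAME}\t%{VERSION}\t%{RELEASE}\t%{EPOCH}\t%{ARCH}\n"

    def rpm_package_facts():
        out = subprocess.run(
            ["rpm", "-qa", "--qf", QUERY],
            check=True, capture_output=True, text=True,
        ).stdout
        packages = {}
        for line in out.splitlines():
            name, version, release, epoch, arch = line.split("\t")
            packages.setdefault(name, []).append({
                "name": name,
                "version": version,
                "release": release,
                # rpm prints "(none)" when a package has no epoch; the module reports null there
                "epoch": None if epoch == "(none)" else int(epoch),
                "arch": arch,
                "source": "rpm",
            })
        return packages

For instance, on the managed node in this log, rpm_package_facts()["bash"][0]["version"] would give "5.2.26", matching the bash entry in the module output below.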
34589 1727204130.92526: Set connection var ansible_shell_executable to /bin/sh 34589 1727204130.92538: Set connection var ansible_timeout to 10 34589 1727204130.92549: Set connection var ansible_shell_type to sh 34589 1727204130.92564: Set connection var ansible_connection to ssh 34589 1727204130.92572: Set connection var ansible_pipelining to False 34589 1727204130.92601: variable 'ansible_shell_executable' from source: unknown 34589 1727204130.92609: variable 'ansible_connection' from source: unknown 34589 1727204130.92662: variable 'ansible_module_compression' from source: unknown 34589 1727204130.92665: variable 'ansible_shell_type' from source: unknown 34589 1727204130.92667: variable 'ansible_shell_executable' from source: unknown 34589 1727204130.92669: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204130.92671: variable 'ansible_pipelining' from source: unknown 34589 1727204130.92672: variable 'ansible_timeout' from source: unknown 34589 1727204130.92674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204130.92855: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204130.92878: variable 'omit' from source: magic vars 34589 1727204130.92891: starting attempt loop 34589 1727204130.92897: running the handler 34589 1727204130.92916: _low_level_execute_command(): starting 34589 1727204130.92981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204130.93702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204130.93782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204130.93825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204130.93912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204130.95712: stdout chunk (state=3): >>>/root <<< 34589 1727204130.95867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204130.95870: stdout chunk (state=3): >>><<< 34589 1727204130.95872: stderr chunk (state=3): >>><<< 34589 1727204130.95984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204130.95988: _low_level_execute_command(): starting 34589 1727204130.95990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558 `" && echo ansible-tmp-1727204130.9589496-37407-194958351488558="` echo /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558 `" ) && sleep 0' 34589 1727204130.96597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204130.96721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204130.96742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204130.96785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204130.96893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204130.99021: stdout chunk (state=3): >>>ansible-tmp-1727204130.9589496-37407-194958351488558=/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558 <<< 34589 1727204130.99199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204130.99203: stdout chunk (state=3): >>><<< 34589 1727204130.99205: stderr chunk (state=3): >>><<< 34589 1727204130.99488: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204130.9589496-37407-194958351488558=/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204130.99492: variable 'ansible_module_compression' from source: unknown 34589 1727204130.99495: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 34589 1727204131.00095: variable 'ansible_facts' from source: unknown 34589 1727204131.00502: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py 34589 1727204131.00850: Sending initial data 34589 1727204131.00861: Sent initial data (162 bytes) 34589 1727204131.01414: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204131.01493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204131.01539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204131.01556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204131.01580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204131.01692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204131.03490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" 
revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204131.03599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204131.03700: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpsxvjee7k /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py <<< 34589 1727204131.03724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py" <<< 34589 1727204131.03796: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpsxvjee7k" to remote "/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py" <<< 34589 1727204131.05504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204131.05580: stderr chunk (state=3): >>><<< 34589 1727204131.05599: stdout chunk (state=3): >>><<< 34589 1727204131.05634: done transferring module to remote 34589 1727204131.05652: _low_level_execute_command(): starting 34589 1727204131.05665: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/ /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py && sleep 0' 34589 1727204131.06469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204131.06479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204131.06505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204131.06615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204131.08638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204131.08642: stdout chunk (state=3): >>><<< 34589 1727204131.08645: stderr chunk (state=3): >>><<< 34589 1727204131.08663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204131.08680: _low_level_execute_command(): starting 34589 1727204131.08766: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/AnsiballZ_package_facts.py && sleep 0' 34589 1727204131.09318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204131.09392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204131.09443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204131.09468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204131.09502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204131.09620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204131.56627: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": 
"1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 34589 1727204131.56639: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": 
"5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", 
"release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 34589 1727204131.56666: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": 
[{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": 
"2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 34589 1727204131.56695: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 34589 1727204131.56705: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 34589 1727204131.56732: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": 
"48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": 
"kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el1<<< 34589 1727204131.56750: stdout chunk (state=3): >>>0", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": 
"2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 34589 1727204131.56781: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 34589 1727204131.56798: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 34589 1727204131.56804: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 34589 1727204131.56834: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": 
"python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 34589 1727204131.56840: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 34589 1727204131.59316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204131.59336: stderr chunk (state=3): >>><<< 34589 1727204131.59340: stdout chunk (state=3): >>><<< 34589 1727204131.59489: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204131.61616: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204131.61723: _low_level_execute_command(): starting 34589 1727204131.61727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204130.9589496-37407-194958351488558/ > /dev/null 2>&1 && sleep 0' 34589 1727204131.62267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204131.62285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204131.62301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204131.62318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204131.62334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204131.62433: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204131.62445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204131.62556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204131.64571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204131.64639: stderr chunk (state=3): >>><<< 34589 1727204131.64647: stdout chunk (state=3): >>><<< 34589 1727204131.64667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204131.64686: handler run complete 34589 1727204131.65534: variable 'ansible_facts' from source: unknown 34589 1727204131.66093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204131.72748: variable 'ansible_facts' from source: unknown 34589 1727204131.73300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204131.74000: attempt loop complete, returning result 34589 1727204131.74013: _execute() done 34589 1727204131.74016: dumping result to json 34589 1727204131.74224: done dumping result, returning 34589 1727204131.74233: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-a9c6-cddc-000000000568] 34589 1727204131.74235: sending task result for task 028d2410-947f-a9c6-cddc-000000000568 34589 1727204131.75983: done sending task result for task 028d2410-947f-a9c6-cddc-000000000568 34589 1727204131.75986: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204131.76063: no more pending results, returning what we have 34589 1727204131.76065: results queue empty 34589 1727204131.76065: checking for any_errors_fatal 34589 1727204131.76068: done checking for any_errors_fatal 34589 1727204131.76069: checking for max_fail_percentage 34589 1727204131.76070: done checking for max_fail_percentage 34589 1727204131.76070: checking to see if all hosts have failed and the running result is not ok 34589 1727204131.76071: done checking to see if all hosts have failed 34589 1727204131.76071: getting the remaining hosts for this loop 34589 1727204131.76072: done getting the remaining hosts for this loop 34589 1727204131.76075: getting the next task for host managed-node1 34589 1727204131.76081: done getting next task for host managed-node1 34589 1727204131.76084: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204131.76085: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204131.76091: getting variables 34589 1727204131.76092: in VariableManager get_vars() 34589 1727204131.76114: Calling all_inventory to load vars for managed-node1 34589 1727204131.76116: Calling groups_inventory to load vars for managed-node1 34589 1727204131.76117: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204131.76124: Calling all_plugins_play to load vars for managed-node1 34589 1727204131.76125: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204131.76127: Calling groups_plugins_play to load vars for managed-node1 34589 1727204131.82529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204131.84092: done with get_vars() 34589 1727204131.84123: done getting variables 34589 1727204131.84178: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.936) 0:00:31.976 ***** 34589 1727204131.84206: entering _queue_task() for managed-node1/debug 34589 1727204131.84784: worker is 1 (out of 1 available) 34589 1727204131.84794: exiting _queue_task() for managed-node1/debug 34589 1727204131.84804: done queuing things up, now waiting for results queue to drain 34589 1727204131.84806: waiting for pending results... 34589 1727204131.85295: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 34589 1727204131.85302: in run() - task 028d2410-947f-a9c6-cddc-000000000085 34589 1727204131.85403: variable 'ansible_search_path' from source: unknown 34589 1727204131.85416: variable 'ansible_search_path' from source: unknown 34589 1727204131.85462: calling self._execute() 34589 1727204131.85885: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204131.85890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204131.85894: variable 'omit' from source: magic vars 34589 1727204131.86482: variable 'ansible_distribution_major_version' from source: facts 34589 1727204131.86512: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204131.86523: variable 'omit' from source: magic vars 34589 1727204131.86569: variable 'omit' from source: magic vars 34589 1727204131.86665: variable 'network_provider' from source: set_fact 34589 1727204131.86770: variable 'omit' from source: magic vars 34589 1727204131.86774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204131.86779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204131.86797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204131.86818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204131.86831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 
1727204131.86859: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204131.86866: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204131.86877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204131.86970: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204131.86982: Set connection var ansible_shell_executable to /bin/sh 34589 1727204131.86998: Set connection var ansible_timeout to 10 34589 1727204131.87004: Set connection var ansible_shell_type to sh 34589 1727204131.87014: Set connection var ansible_connection to ssh 34589 1727204131.87021: Set connection var ansible_pipelining to False 34589 1727204131.87048: variable 'ansible_shell_executable' from source: unknown 34589 1727204131.87056: variable 'ansible_connection' from source: unknown 34589 1727204131.87063: variable 'ansible_module_compression' from source: unknown 34589 1727204131.87068: variable 'ansible_shell_type' from source: unknown 34589 1727204131.87072: variable 'ansible_shell_executable' from source: unknown 34589 1727204131.87098: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204131.87101: variable 'ansible_pipelining' from source: unknown 34589 1727204131.87103: variable 'ansible_timeout' from source: unknown 34589 1727204131.87105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204131.87233: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204131.87316: variable 'omit' from source: magic vars 34589 1727204131.87320: starting attempt loop 34589 1727204131.87322: running the handler 34589 1727204131.87325: handler run complete 34589 1727204131.87335: attempt loop complete, returning result 34589 1727204131.87341: _execute() done 34589 1727204131.87348: dumping result to json 34589 1727204131.87354: done dumping result, returning 34589 1727204131.87364: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-a9c6-cddc-000000000085] 34589 1727204131.87373: sending task result for task 028d2410-947f-a9c6-cddc-000000000085 ok: [managed-node1] => {} MSG: Using network provider: nm 34589 1727204131.87636: no more pending results, returning what we have 34589 1727204131.87640: results queue empty 34589 1727204131.87641: checking for any_errors_fatal 34589 1727204131.87653: done checking for any_errors_fatal 34589 1727204131.87654: checking for max_fail_percentage 34589 1727204131.87656: done checking for max_fail_percentage 34589 1727204131.87656: checking to see if all hosts have failed and the running result is not ok 34589 1727204131.87657: done checking to see if all hosts have failed 34589 1727204131.87658: getting the remaining hosts for this loop 34589 1727204131.87659: done getting the remaining hosts for this loop 34589 1727204131.87663: getting the next task for host managed-node1 34589 1727204131.87670: done getting next task for host managed-node1 34589 1727204131.87674: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204131.87678: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204131.87688: getting variables 34589 1727204131.87690: in VariableManager get_vars() 34589 1727204131.87728: Calling all_inventory to load vars for managed-node1 34589 1727204131.87731: Calling groups_inventory to load vars for managed-node1 34589 1727204131.87733: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204131.87744: Calling all_plugins_play to load vars for managed-node1 34589 1727204131.87747: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204131.87750: Calling groups_plugins_play to load vars for managed-node1 34589 1727204131.88317: done sending task result for task 028d2410-947f-a9c6-cddc-000000000085 34589 1727204131.88320: WORKER PROCESS EXITING 34589 1727204131.89260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204131.90772: done with get_vars() 34589 1727204131.90798: done getting variables 34589 1727204131.90854: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.066) 0:00:32.043 ***** 34589 1727204131.90889: entering _queue_task() for managed-node1/fail 34589 1727204131.91193: worker is 1 (out of 1 available) 34589 1727204131.91204: exiting _queue_task() for managed-node1/fail 34589 1727204131.91215: done queuing things up, now waiting for results queue to drain 34589 1727204131.91217: waiting for pending results... 
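The package_facts payload dumped earlier in this run (and hidden from the task result because no_log was set) is keyed by package name, and each value is a list of dicts carrying name, version, release, epoch, arch and source, so a package can appear once per installed architecture. As a minimal, hypothetical illustration of consuming that structure, something like the following two tasks would work; the NetworkManager lookup is only an example key taken from the captured facts above, not part of the role:

    - name: Gather installed packages
      ansible.builtin.package_facts:
        manager: auto

    - name: Report the NetworkManager version found by package_facts
      ansible.builtin.debug:
        # 'NetworkManager' is just an example key present in the captured facts
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
      when: "'NetworkManager' in ansible_facts.packages"
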
34589 1727204131.91493: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34589 1727204131.91616: in run() - task 028d2410-947f-a9c6-cddc-000000000086 34589 1727204131.91635: variable 'ansible_search_path' from source: unknown 34589 1727204131.91642: variable 'ansible_search_path' from source: unknown 34589 1727204131.91682: calling self._execute() 34589 1727204131.91790: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204131.91802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204131.91817: variable 'omit' from source: magic vars 34589 1727204131.92188: variable 'ansible_distribution_major_version' from source: facts 34589 1727204131.92203: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204131.92327: variable 'network_state' from source: role '' defaults 34589 1727204131.92342: Evaluated conditional (network_state != {}): False 34589 1727204131.92350: when evaluation is False, skipping this task 34589 1727204131.92356: _execute() done 34589 1727204131.92365: dumping result to json 34589 1727204131.92377: done dumping result, returning 34589 1727204131.92389: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-a9c6-cddc-000000000086] 34589 1727204131.92398: sending task result for task 028d2410-947f-a9c6-cddc-000000000086 34589 1727204131.92543: done sending task result for task 028d2410-947f-a9c6-cddc-000000000086 34589 1727204131.92546: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204131.92631: no more pending results, returning what we have 34589 1727204131.92634: results queue empty 34589 1727204131.92635: checking for any_errors_fatal 34589 1727204131.92642: done checking for any_errors_fatal 34589 1727204131.92642: checking for max_fail_percentage 34589 1727204131.92644: done checking for max_fail_percentage 34589 1727204131.92645: checking to see if all hosts have failed and the running result is not ok 34589 1727204131.92646: done checking to see if all hosts have failed 34589 1727204131.92646: getting the remaining hosts for this loop 34589 1727204131.92648: done getting the remaining hosts for this loop 34589 1727204131.92652: getting the next task for host managed-node1 34589 1727204131.92659: done getting next task for host managed-node1 34589 1727204131.92663: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204131.92665: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204131.92681: getting variables 34589 1727204131.92683: in VariableManager get_vars() 34589 1727204131.92720: Calling all_inventory to load vars for managed-node1 34589 1727204131.92723: Calling groups_inventory to load vars for managed-node1 34589 1727204131.92725: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204131.92737: Calling all_plugins_play to load vars for managed-node1 34589 1727204131.92740: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204131.92743: Calling groups_plugins_play to load vars for managed-node1 34589 1727204131.94269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204131.95807: done with get_vars() 34589 1727204131.95830: done getting variables 34589 1727204131.95890: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.050) 0:00:32.094 ***** 34589 1727204131.95923: entering _queue_task() for managed-node1/fail 34589 1727204131.96223: worker is 1 (out of 1 available) 34589 1727204131.96235: exiting _queue_task() for managed-node1/fail 34589 1727204131.96246: done queuing things up, now waiting for results queue to drain 34589 1727204131.96248: waiting for pending results... 
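The two skips above come from fail-style guard tasks in roles/network/tasks/main.yml whose when terms are echoed back as "false_condition". A minimal sketch of the first guard, assuming the module arguments and message wording (the log only confirms the task name, the task path, and the two conditions evaluated); the distribution-version term is most likely inherited from an enclosing block or include rather than written on the task itself, and any further terms would not appear in the log because evaluation stops at the first term that is False:

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported by the initscripts provider  # wording assumed, not shown in the log
  when:
    - ansible_distribution_major_version != '6'   # evaluated True above
    - network_state != {}                         # evaluated False, so the task is skipped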
34589 1727204131.96602: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34589 1727204131.96644: in run() - task 028d2410-947f-a9c6-cddc-000000000087 34589 1727204131.96663: variable 'ansible_search_path' from source: unknown 34589 1727204131.96670: variable 'ansible_search_path' from source: unknown 34589 1727204131.96718: calling self._execute() 34589 1727204131.96827: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204131.96840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204131.96852: variable 'omit' from source: magic vars 34589 1727204131.97239: variable 'ansible_distribution_major_version' from source: facts 34589 1727204131.97248: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204131.97584: variable 'network_state' from source: role '' defaults 34589 1727204131.97588: Evaluated conditional (network_state != {}): False 34589 1727204131.97591: when evaluation is False, skipping this task 34589 1727204131.97593: _execute() done 34589 1727204131.97595: dumping result to json 34589 1727204131.97597: done dumping result, returning 34589 1727204131.97600: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-a9c6-cddc-000000000087] 34589 1727204131.97602: sending task result for task 028d2410-947f-a9c6-cddc-000000000087 34589 1727204131.97667: done sending task result for task 028d2410-947f-a9c6-cddc-000000000087 34589 1727204131.97669: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204131.97714: no more pending results, returning what we have 34589 1727204131.97717: results queue empty 34589 1727204131.97718: checking for any_errors_fatal 34589 1727204131.97725: done checking for any_errors_fatal 34589 1727204131.97726: checking for max_fail_percentage 34589 1727204131.97727: done checking for max_fail_percentage 34589 1727204131.97728: checking to see if all hosts have failed and the running result is not ok 34589 1727204131.97729: done checking to see if all hosts have failed 34589 1727204131.97729: getting the remaining hosts for this loop 34589 1727204131.97731: done getting the remaining hosts for this loop 34589 1727204131.97734: getting the next task for host managed-node1 34589 1727204131.97739: done getting next task for host managed-node1 34589 1727204131.97744: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204131.97746: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204131.97760: getting variables 34589 1727204131.97761: in VariableManager get_vars() 34589 1727204131.97798: Calling all_inventory to load vars for managed-node1 34589 1727204131.97801: Calling groups_inventory to load vars for managed-node1 34589 1727204131.97803: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204131.97814: Calling all_plugins_play to load vars for managed-node1 34589 1727204131.97817: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204131.97820: Calling groups_plugins_play to load vars for managed-node1 34589 1727204131.99190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.00866: done with get_vars() 34589 1727204132.00888: done getting variables 34589 1727204132.00941: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.050) 0:00:32.144 ***** 34589 1727204132.00970: entering _queue_task() for managed-node1/fail 34589 1727204132.01440: worker is 1 (out of 1 available) 34589 1727204132.01453: exiting _queue_task() for managed-node1/fail 34589 1727204132.01465: done queuing things up, now waiting for results queue to drain 34589 1727204132.01466: waiting for pending results... 
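The skipping: blocks in this run surface the same fields that a registered result carries, so the reason for a conditional skip can also be inspected from inside a play. A small self-contained sketch, assuming false_condition is present in registered skip results on this ansible-core version; the task names and sample variable are hypothetical and not part of the role:

- name: Guard that is skipped when the dictionary is empty (illustrative only)
  ansible.builtin.fail:
    msg: never reached
  vars:
    example_state: {}
  when: example_state != {}
  register: state_guard

- name: Inspect why the guard was skipped
  ansible.builtin.debug:
    msg: >-
      skipped={{ state_guard.skipped | default(false) }}
      reason={{ state_guard.skip_reason | default('') }}
      condition={{ state_guard.false_condition | default('') }}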
34589 1727204132.01754: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34589 1727204132.01879: in run() - task 028d2410-947f-a9c6-cddc-000000000088 34589 1727204132.01906: variable 'ansible_search_path' from source: unknown 34589 1727204132.02080: variable 'ansible_search_path' from source: unknown 34589 1727204132.02085: calling self._execute() 34589 1727204132.02087: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.02090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.02093: variable 'omit' from source: magic vars 34589 1727204132.02483: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.02502: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.02689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.05150: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.05234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.05273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.05317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.05350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.05439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.05477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.05514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.05561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.05586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.05689: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.05713: Evaluated conditional (ansible_distribution_major_version | int > 9): True 34589 1727204132.05838: variable 'ansible_distribution' from source: facts 34589 1727204132.05847: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.06080: Evaluated conditional (ansible_distribution in __network_rh_distros): True 34589 1727204132.06119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.06149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.06182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.06231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.06249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.06305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.06336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.06367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.06418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.06437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.06482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.06628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.06632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.06634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.06636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.07050: variable 'network_connections' from source: play vars 34589 1727204132.07123: variable 'profile' from source: play vars 34589 1727204132.07202: variable 'profile' from source: play vars 34589 1727204132.07213: variable 'interface' from source: set_fact 34589 1727204132.07282: variable 'interface' from source: set_fact 34589 1727204132.07299: variable 'network_state' from source: role '' defaults 34589 
1727204132.07360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204132.07543: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.07583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.07626: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.07663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.07719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.07756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.07789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.07816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.07848: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 34589 1727204132.08043: when evaluation is False, skipping this task 34589 1727204132.08046: _execute() done 34589 1727204132.08048: dumping result to json 34589 1727204132.08051: done dumping result, returning 34589 1727204132.08053: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-a9c6-cddc-000000000088] 34589 1727204132.08055: sending task result for task 028d2410-947f-a9c6-cddc-000000000088 34589 1727204132.08128: done sending task result for task 028d2410-947f-a9c6-cddc-000000000088 34589 1727204132.08132: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 34589 1727204132.08197: no more pending results, returning what we have 34589 1727204132.08201: results queue empty 34589 1727204132.08202: checking for any_errors_fatal 34589 1727204132.08210: done checking for any_errors_fatal 34589 1727204132.08211: checking for max_fail_percentage 34589 1727204132.08212: done checking for max_fail_percentage 34589 1727204132.08213: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.08214: done checking to see if all hosts have failed 34589 1727204132.08215: getting the remaining hosts for this loop 34589 1727204132.08216: done getting the remaining hosts for this loop 34589 1727204132.08221: getting the next 
task for host managed-node1 34589 1727204132.08228: done getting next task for host managed-node1 34589 1727204132.08232: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204132.08235: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204132.08248: getting variables 34589 1727204132.08250: in VariableManager get_vars() 34589 1727204132.08292: Calling all_inventory to load vars for managed-node1 34589 1727204132.08296: Calling groups_inventory to load vars for managed-node1 34589 1727204132.08298: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.08311: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.08315: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.08318: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.10218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.11941: done with get_vars() 34589 1727204132.11966: done getting variables 34589 1727204132.12043: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.111) 0:00:32.255 ***** 34589 1727204132.12073: entering _queue_task() for managed-node1/dnf 34589 1727204132.12452: worker is 1 (out of 1 available) 34589 1727204132.12463: exiting _queue_task() for managed-node1/dnf 34589 1727204132.12479: done queuing things up, now waiting for results queue to drain 34589 1727204132.12480: waiting for pending results... 
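The EL10 teaming guard above is only reached after the distribution and version terms evaluate True; the term that finally fails is the long expression echoed in false_condition, which scans both network_connections and network_state.get("interfaces", []) for profiles of type "team" and finds none for this profile. The same selectattr chain can be exercised in isolation; the task below is not part of the role and its sample data is hypothetical:

- name: Demonstrate the team-detection filter chain seen in the log
  ansible.builtin.debug:
    msg: "{{ sample_connections | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0 }}"
  vars:
    sample_connections:
      - name: eth0-profile   # hypothetical ethernet profile, does not match ^team$
        type: ethernet
      - name: team0          # hypothetical team profile, makes the expression True
        type: team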
34589 1727204132.12794: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34589 1727204132.12881: in run() - task 028d2410-947f-a9c6-cddc-000000000089 34589 1727204132.13080: variable 'ansible_search_path' from source: unknown 34589 1727204132.13084: variable 'ansible_search_path' from source: unknown 34589 1727204132.13087: calling self._execute() 34589 1727204132.13090: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.13093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.13095: variable 'omit' from source: magic vars 34589 1727204132.13453: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.13470: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.13667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.15853: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.16249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.16294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.16334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.16364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.16446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.16482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.16513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.16558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.16578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.16693: variable 'ansible_distribution' from source: facts 34589 1727204132.16704: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.16724: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 34589 1727204132.16838: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.16970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.17001: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.17029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.17073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.17095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.17140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.17168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.17198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.17241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.17261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.17481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.17485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.17487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.17490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.17492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.17565: variable 'network_connections' from source: play vars 34589 1727204132.17584: variable 'profile' from source: play vars 34589 1727204132.17657: variable 'profile' from source: play vars 34589 1727204132.17666: variable 'interface' from source: set_fact 34589 1727204132.17733: variable 'interface' from source: set_fact 34589 1727204132.17810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 34589 1727204132.17987: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.18028: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.18068: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.18118: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.18168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.18197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.18235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.18270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.18322: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204132.18564: variable 'network_connections' from source: play vars 34589 1727204132.18582: variable 'profile' from source: play vars 34589 1727204132.18643: variable 'profile' from source: play vars 34589 1727204132.18652: variable 'interface' from source: set_fact 34589 1727204132.18717: variable 'interface' from source: set_fact 34589 1727204132.18796: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204132.18799: when evaluation is False, skipping this task 34589 1727204132.18801: _execute() done 34589 1727204132.18803: dumping result to json 34589 1727204132.18805: done dumping result, returning 34589 1727204132.18807: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-000000000089] 34589 1727204132.18809: sending task result for task 028d2410-947f-a9c6-cddc-000000000089 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204132.18954: no more pending results, returning what we have 34589 1727204132.18959: results queue empty 34589 1727204132.18960: checking for any_errors_fatal 34589 1727204132.18966: done checking for any_errors_fatal 34589 1727204132.18967: checking for max_fail_percentage 34589 1727204132.18968: done checking for max_fail_percentage 34589 1727204132.18969: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.18970: done checking to see if all hosts have failed 34589 1727204132.18971: getting the remaining hosts for this loop 34589 1727204132.18973: done getting the remaining hosts for this loop 34589 1727204132.18979: getting the next task for host managed-node1 34589 1727204132.18986: done getting next task for host managed-node1 34589 
1727204132.18991: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204132.18993: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204132.19008: getting variables 34589 1727204132.19010: in VariableManager get_vars() 34589 1727204132.19049: Calling all_inventory to load vars for managed-node1 34589 1727204132.19052: Calling groups_inventory to load vars for managed-node1 34589 1727204132.19055: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.19065: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.19068: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.19070: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.19989: done sending task result for task 028d2410-947f-a9c6-cddc-000000000089 34589 1727204132.19992: WORKER PROCESS EXITING 34589 1727204132.20920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.22058: done with get_vars() 34589 1727204132.22078: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34589 1727204132.22134: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.100) 0:00:32.356 ***** 34589 1727204132.22156: entering _queue_task() for managed-node1/yum 34589 1727204132.22421: worker is 1 (out of 1 available) 34589 1727204132.22433: exiting _queue_task() for managed-node1/yum 34589 1727204132.22446: done queuing things up, now waiting for results queue to drain 34589 1727204132.22447: waiting for pending results... 
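The DNF check above loads the dnf action plugin but never reaches the module, because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this profile. Its when terms are confirmed by the log; the module arguments below are assumed purely to make the sketch concrete and are not visible in the output:

- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # assumed argument, not shown in the log
    state: latest                    # assumed argument, not shown in the log
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7  # evaluated True above
    - __network_wireless_connections_defined or __network_team_connections_defined      # evaluated False, task skipped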
34589 1727204132.22631: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34589 1727204132.22711: in run() - task 028d2410-947f-a9c6-cddc-00000000008a 34589 1727204132.22725: variable 'ansible_search_path' from source: unknown 34589 1727204132.22729: variable 'ansible_search_path' from source: unknown 34589 1727204132.22759: calling self._execute() 34589 1727204132.22840: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.22844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.22852: variable 'omit' from source: magic vars 34589 1727204132.23144: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.23153: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.23382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.25178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.25233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.25262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.25288: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.25312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.25369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.25391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.25411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.25439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.25450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.25523: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.25539: Evaluated conditional (ansible_distribution_major_version | int < 8): False 34589 1727204132.25543: when evaluation is False, skipping this task 34589 1727204132.25545: _execute() done 34589 1727204132.25547: dumping result to json 34589 1727204132.25549: done dumping result, returning 34589 1727204132.25557: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000008a] 34589 
1727204132.25560: sending task result for task 028d2410-947f-a9c6-cddc-00000000008a 34589 1727204132.25652: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008a 34589 1727204132.25654: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 34589 1727204132.25711: no more pending results, returning what we have 34589 1727204132.25714: results queue empty 34589 1727204132.25715: checking for any_errors_fatal 34589 1727204132.25720: done checking for any_errors_fatal 34589 1727204132.25721: checking for max_fail_percentage 34589 1727204132.25724: done checking for max_fail_percentage 34589 1727204132.25725: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.25726: done checking to see if all hosts have failed 34589 1727204132.25726: getting the remaining hosts for this loop 34589 1727204132.25728: done getting the remaining hosts for this loop 34589 1727204132.25731: getting the next task for host managed-node1 34589 1727204132.25737: done getting next task for host managed-node1 34589 1727204132.25740: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204132.25742: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204132.25754: getting variables 34589 1727204132.25756: in VariableManager get_vars() 34589 1727204132.25796: Calling all_inventory to load vars for managed-node1 34589 1727204132.25799: Calling groups_inventory to load vars for managed-node1 34589 1727204132.25801: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.25812: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.25815: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.25818: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.26986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.28717: done with get_vars() 34589 1727204132.28745: done getting variables 34589 1727204132.28813: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.066) 0:00:32.423 ***** 34589 1727204132.28844: entering _queue_task() for managed-node1/fail 34589 1727204132.29229: worker is 1 (out of 1 available) 34589 1727204132.29242: exiting _queue_task() for managed-node1/fail 34589 1727204132.29255: done queuing things up, now waiting for results queue to drain 34589 1727204132.29256: waiting for pending results... 
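The YUM variant only applies to hosts below major version 8, so it is skipped here before any package work happens. Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above it: on this ansible-core release the yum action resolves to the dnf action plugin, which is why the dnf action module path is loaded for a yum task. A sketch of the shape implied by the log, again with assumed module arguments:

- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:               # resolved to ansible.builtin.dnf by the redirect shown in the log
    name: "{{ network_packages }}"   # assumed argument, not shown in the log
    state: latest                    # assumed argument, not shown in the log
  when:
    - ansible_distribution_major_version | int < 8   # evaluated False in the log, so the task is skipped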
34589 1727204132.29796: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34589 1727204132.29803: in run() - task 028d2410-947f-a9c6-cddc-00000000008b 34589 1727204132.29810: variable 'ansible_search_path' from source: unknown 34589 1727204132.29814: variable 'ansible_search_path' from source: unknown 34589 1727204132.29822: calling self._execute() 34589 1727204132.29825: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.29833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.29843: variable 'omit' from source: magic vars 34589 1727204132.30251: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.30267: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.30396: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.30614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.33878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.33956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.34198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.34240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.34265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.34395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.34502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.34562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.34738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.34754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.34810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.34830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.34970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.35014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.35029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.35191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.35214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.35238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.35393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.35412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.35780: variable 'network_connections' from source: play vars 34589 1727204132.35832: variable 'profile' from source: play vars 34589 1727204132.35959: variable 'profile' from source: play vars 34589 1727204132.35963: variable 'interface' from source: set_fact 34589 1727204132.36026: variable 'interface' from source: set_fact 34589 1727204132.36284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204132.36569: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.36719: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.36817: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.36861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.36909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.36928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.37072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.37180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.37183: 
variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204132.37789: variable 'network_connections' from source: play vars 34589 1727204132.37792: variable 'profile' from source: play vars 34589 1727204132.37866: variable 'profile' from source: play vars 34589 1727204132.37869: variable 'interface' from source: set_fact 34589 1727204132.38050: variable 'interface' from source: set_fact 34589 1727204132.38075: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204132.38080: when evaluation is False, skipping this task 34589 1727204132.38139: _execute() done 34589 1727204132.38142: dumping result to json 34589 1727204132.38152: done dumping result, returning 34589 1727204132.38161: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000008b] 34589 1727204132.38171: sending task result for task 028d2410-947f-a9c6-cddc-00000000008b 34589 1727204132.38456: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008b 34589 1727204132.38460: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204132.38525: no more pending results, returning what we have 34589 1727204132.38529: results queue empty 34589 1727204132.38530: checking for any_errors_fatal 34589 1727204132.38538: done checking for any_errors_fatal 34589 1727204132.38539: checking for max_fail_percentage 34589 1727204132.38541: done checking for max_fail_percentage 34589 1727204132.38542: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.38543: done checking to see if all hosts have failed 34589 1727204132.38543: getting the remaining hosts for this loop 34589 1727204132.38545: done getting the remaining hosts for this loop 34589 1727204132.38549: getting the next task for host managed-node1 34589 1727204132.38556: done getting next task for host managed-node1 34589 1727204132.38561: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34589 1727204132.38563: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204132.38578: getting variables 34589 1727204132.38580: in VariableManager get_vars() 34589 1727204132.38642: Calling all_inventory to load vars for managed-node1 34589 1727204132.38646: Calling groups_inventory to load vars for managed-node1 34589 1727204132.38648: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.38658: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.38661: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.38664: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.40635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.42916: done with get_vars() 34589 1727204132.42942: done getting variables 34589 1727204132.43126: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.143) 0:00:32.566 ***** 34589 1727204132.43159: entering _queue_task() for managed-node1/package 34589 1727204132.43974: worker is 1 (out of 1 available) 34589 1727204132.44058: exiting _queue_task() for managed-node1/package 34589 1727204132.44070: done queuing things up, now waiting for results queue to drain 34589 1727204132.44071: waiting for pending results... 34589 1727204132.44396: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 34589 1727204132.44785: in run() - task 028d2410-947f-a9c6-cddc-00000000008c 34589 1727204132.44789: variable 'ansible_search_path' from source: unknown 34589 1727204132.44792: variable 'ansible_search_path' from source: unknown 34589 1727204132.44795: calling self._execute() 34589 1727204132.45095: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.45100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.45115: variable 'omit' from source: magic vars 34589 1727204132.45925: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.45936: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.46325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204132.46808: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.46857: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.47102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.47211: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.47280: variable 'network_packages' from source: role '' defaults 34589 1727204132.47588: variable '__network_provider_setup' from source: role '' defaults 34589 1727204132.47599: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204132.47667: variable 
'__network_service_name_default_nm' from source: role '' defaults 34589 1727204132.47676: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204132.47940: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204132.48325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.52432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.52467: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.52696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.52757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.52762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.52843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.52970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.53101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.53143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.53156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.53201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.53226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.53249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.53691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.53732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.54000: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204132.54271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.54276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.54279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.54282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.54284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.54402: variable 'ansible_python' from source: facts 34589 1727204132.54406: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204132.54427: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204132.54636: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204132.54988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.54991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.54994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.54996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.54998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.55000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.55011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.55014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.55016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.55093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.55170: variable 'network_connections' from source: play vars 34589 1727204132.55178: variable 'profile' from source: play vars 34589 1727204132.55312: variable 'profile' from source: play vars 34589 1727204132.55316: variable 'interface' from source: set_fact 34589 1727204132.55528: variable 'interface' from source: set_fact 34589 1727204132.55534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.55536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.55539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.55541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.55637: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.55894: variable 'network_connections' from source: play vars 34589 1727204132.55898: variable 'profile' from source: play vars 34589 1727204132.56019: variable 'profile' from source: play vars 34589 1727204132.56079: variable 'interface' from source: set_fact 34589 1727204132.56124: variable 'interface' from source: set_fact 34589 1727204132.56156: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204132.56527: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.56954: variable 'network_connections' from source: play vars 34589 1727204132.56963: variable 'profile' from source: play vars 34589 1727204132.57035: variable 'profile' from source: play vars 34589 1727204132.57044: variable 'interface' from source: set_fact 34589 1727204132.57303: variable 'interface' from source: set_fact 34589 1727204132.57331: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204132.57407: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204132.57706: variable 'network_connections' from source: play vars 34589 1727204132.57719: variable 'profile' from source: play vars 34589 1727204132.57777: variable 'profile' from source: play vars 34589 1727204132.57780: variable 'interface' from source: set_fact 34589 1727204132.57943: variable 'interface' from source: set_fact 34589 1727204132.58047: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204132.58060: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204132.58067: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204132.58234: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204132.58668: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204132.59709: variable 'network_connections' from source: play vars 34589 1727204132.59717: variable 'profile' from source: play vars 34589 
1727204132.59798: variable 'profile' from source: play vars 34589 1727204132.59805: variable 'interface' from source: set_fact 34589 1727204132.59844: variable 'interface' from source: set_fact 34589 1727204132.59880: variable 'ansible_distribution' from source: facts 34589 1727204132.59883: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.59886: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.59888: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204132.60186: variable 'ansible_distribution' from source: facts 34589 1727204132.60189: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.60191: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.60193: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204132.60471: variable 'ansible_distribution' from source: facts 34589 1727204132.60476: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.60481: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.60519: variable 'network_provider' from source: set_fact 34589 1727204132.60534: variable 'ansible_facts' from source: unknown 34589 1727204132.61843: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 34589 1727204132.61847: when evaluation is False, skipping this task 34589 1727204132.61849: _execute() done 34589 1727204132.61852: dumping result to json 34589 1727204132.61854: done dumping result, returning 34589 1727204132.61862: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-a9c6-cddc-00000000008c] 34589 1727204132.61865: sending task result for task 028d2410-947f-a9c6-cddc-00000000008c skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 34589 1727204132.62128: no more pending results, returning what we have 34589 1727204132.62131: results queue empty 34589 1727204132.62132: checking for any_errors_fatal 34589 1727204132.62138: done checking for any_errors_fatal 34589 1727204132.62139: checking for max_fail_percentage 34589 1727204132.62141: done checking for max_fail_percentage 34589 1727204132.62141: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.62142: done checking to see if all hosts have failed 34589 1727204132.62143: getting the remaining hosts for this loop 34589 1727204132.62144: done getting the remaining hosts for this loop 34589 1727204132.62147: getting the next task for host managed-node1 34589 1727204132.62151: done getting next task for host managed-node1 34589 1727204132.62155: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204132.62157: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204132.62170: getting variables 34589 1727204132.62171: in VariableManager get_vars() 34589 1727204132.62205: Calling all_inventory to load vars for managed-node1 34589 1727204132.62210: Calling groups_inventory to load vars for managed-node1 34589 1727204132.62212: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.62220: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.62227: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.62230: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.62777: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008c 34589 1727204132.62782: WORKER PROCESS EXITING 34589 1727204132.63938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.65913: done with get_vars() 34589 1727204132.65935: done getting variables 34589 1727204132.66001: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.228) 0:00:32.795 ***** 34589 1727204132.66033: entering _queue_task() for managed-node1/package 34589 1727204132.66491: worker is 1 (out of 1 available) 34589 1727204132.66505: exiting _queue_task() for managed-node1/package 34589 1727204132.66518: done queuing things up, now waiting for results queue to drain 34589 1727204132.66519: waiting for pending results... 
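The skip recorded just above comes from the package-install guard: the role only calls the package module when the computed network_packages list is not already a subset of the gathered package facts. A minimal sketch of that pattern, using only names that appear in this log (the real task in roles/network/tasks/main.yml is more elaborate):

- name: Install packages
  ansible.builtin.package:
    # network_packages is assembled earlier in the role from the
    # __network_packages_default_* defaults resolved above.
    name: "{{ network_packages }}"
    state: present
  # Same expression as the false_condition reported for the skipped task:
  # only invoke the module when something is actually missing.
  when: not network_packages is subset(ansible_facts.packages.keys())
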
34589 1727204132.66790: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34589 1727204132.66862: in run() - task 028d2410-947f-a9c6-cddc-00000000008d 34589 1727204132.66887: variable 'ansible_search_path' from source: unknown 34589 1727204132.66910: variable 'ansible_search_path' from source: unknown 34589 1727204132.67017: calling self._execute() 34589 1727204132.67081: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.67095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.67114: variable 'omit' from source: magic vars 34589 1727204132.67544: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.67568: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.67711: variable 'network_state' from source: role '' defaults 34589 1727204132.67729: Evaluated conditional (network_state != {}): False 34589 1727204132.67738: when evaluation is False, skipping this task 34589 1727204132.67745: _execute() done 34589 1727204132.67780: dumping result to json 34589 1727204132.67784: done dumping result, returning 34589 1727204132.67788: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-a9c6-cddc-00000000008d] 34589 1727204132.67790: sending task result for task 028d2410-947f-a9c6-cddc-00000000008d skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204132.68030: no more pending results, returning what we have 34589 1727204132.68035: results queue empty 34589 1727204132.68036: checking for any_errors_fatal 34589 1727204132.68044: done checking for any_errors_fatal 34589 1727204132.68045: checking for max_fail_percentage 34589 1727204132.68047: done checking for max_fail_percentage 34589 1727204132.68048: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.68049: done checking to see if all hosts have failed 34589 1727204132.68050: getting the remaining hosts for this loop 34589 1727204132.68051: done getting the remaining hosts for this loop 34589 1727204132.68056: getting the next task for host managed-node1 34589 1727204132.68063: done getting next task for host managed-node1 34589 1727204132.68067: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204132.68070: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204132.68087: getting variables 34589 1727204132.68089: in VariableManager get_vars() 34589 1727204132.68133: Calling all_inventory to load vars for managed-node1 34589 1727204132.68137: Calling groups_inventory to load vars for managed-node1 34589 1727204132.68139: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.68152: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.68155: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.68158: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.68691: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008d 34589 1727204132.68695: WORKER PROCESS EXITING 34589 1727204132.69820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.71459: done with get_vars() 34589 1727204132.71486: done getting variables 34589 1727204132.71555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.055) 0:00:32.850 ***** 34589 1727204132.71587: entering _queue_task() for managed-node1/package 34589 1727204132.71961: worker is 1 (out of 1 available) 34589 1727204132.71974: exiting _queue_task() for managed-node1/package 34589 1727204132.72092: done queuing things up, now waiting for results queue to drain 34589 1727204132.72093: waiting for pending results... 
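This task and the one before it are both tied to the network_state variable. In this run network_state resolves to its role default of {}, so network_state != {} evaluates False and both tasks are skipped. Purely as an illustration of what would flip them on, a play could supply an nmstate-style state instead of (or alongside) network_connections; the interface name and settings below are hypothetical and not taken from this run:

# Hypothetical play vars; this run drives the role via network_connections only.
network_state:
  interfaces:
    - name: eth1        # hypothetical device
      type: ethernet
      state: up
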
34589 1727204132.72277: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34589 1727204132.72401: in run() - task 028d2410-947f-a9c6-cddc-00000000008e 34589 1727204132.72429: variable 'ansible_search_path' from source: unknown 34589 1727204132.72439: variable 'ansible_search_path' from source: unknown 34589 1727204132.72481: calling self._execute() 34589 1727204132.72590: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.72601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.72620: variable 'omit' from source: magic vars 34589 1727204132.73015: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.73032: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.73171: variable 'network_state' from source: role '' defaults 34589 1727204132.73380: Evaluated conditional (network_state != {}): False 34589 1727204132.73384: when evaluation is False, skipping this task 34589 1727204132.73386: _execute() done 34589 1727204132.73389: dumping result to json 34589 1727204132.73391: done dumping result, returning 34589 1727204132.73394: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-a9c6-cddc-00000000008e] 34589 1727204132.73396: sending task result for task 028d2410-947f-a9c6-cddc-00000000008e 34589 1727204132.73467: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008e 34589 1727204132.73470: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204132.73527: no more pending results, returning what we have 34589 1727204132.73532: results queue empty 34589 1727204132.73533: checking for any_errors_fatal 34589 1727204132.73540: done checking for any_errors_fatal 34589 1727204132.73541: checking for max_fail_percentage 34589 1727204132.73543: done checking for max_fail_percentage 34589 1727204132.73544: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.73545: done checking to see if all hosts have failed 34589 1727204132.73545: getting the remaining hosts for this loop 34589 1727204132.73547: done getting the remaining hosts for this loop 34589 1727204132.73551: getting the next task for host managed-node1 34589 1727204132.73557: done getting next task for host managed-node1 34589 1727204132.73561: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204132.73564: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204132.73579: getting variables 34589 1727204132.73581: in VariableManager get_vars() 34589 1727204132.73621: Calling all_inventory to load vars for managed-node1 34589 1727204132.73624: Calling groups_inventory to load vars for managed-node1 34589 1727204132.73626: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.73639: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.73642: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.73645: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.75397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.77048: done with get_vars() 34589 1727204132.77081: done getting variables 34589 1727204132.77151: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.055) 0:00:32.906 ***** 34589 1727204132.77186: entering _queue_task() for managed-node1/service 34589 1727204132.77565: worker is 1 (out of 1 available) 34589 1727204132.77582: exiting _queue_task() for managed-node1/service 34589 1727204132.77689: done queuing things up, now waiting for results queue to drain 34589 1727204132.77691: waiting for pending results... 
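The restart task queued here only fires when wireless or team connections are part of the requested configuration; the evaluation further down returns False for both flags and the task is skipped. A rough sketch of the guard, with the condition copied from the skip result below (the actual task body at roles/network/tasks/main.yml:109 may differ in detail):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Copied from the false_condition reported for this task further down.
  when: __network_wireless_connections_defined or __network_team_connections_defined
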
34589 1727204132.77994: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34589 1727204132.78039: in run() - task 028d2410-947f-a9c6-cddc-00000000008f 34589 1727204132.78060: variable 'ansible_search_path' from source: unknown 34589 1727204132.78066: variable 'ansible_search_path' from source: unknown 34589 1727204132.78106: calling self._execute() 34589 1727204132.78218: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.78243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.78246: variable 'omit' from source: magic vars 34589 1727204132.78627: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.78680: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.78766: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.78982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.81300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.81480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.81485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.81489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.81514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.81601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.81681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.81684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.81726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.81745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.81796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.81880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.81883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 34589 1727204132.81904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.81925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.81980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.82044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.82048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.82084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.82102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.82295: variable 'network_connections' from source: play vars 34589 1727204132.82316: variable 'profile' from source: play vars 34589 1727204132.82403: variable 'profile' from source: play vars 34589 1727204132.82415: variable 'interface' from source: set_fact 34589 1727204132.82680: variable 'interface' from source: set_fact 34589 1727204132.82684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204132.82755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.82806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.82843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.82874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.82931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.82956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.82986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.83026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.83079: variable '__network_team_connections_defined' from source: role '' defaults 34589 
1727204132.83347: variable 'network_connections' from source: play vars 34589 1727204132.83358: variable 'profile' from source: play vars 34589 1727204132.83424: variable 'profile' from source: play vars 34589 1727204132.83433: variable 'interface' from source: set_fact 34589 1727204132.83501: variable 'interface' from source: set_fact 34589 1727204132.83530: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 34589 1727204132.83536: when evaluation is False, skipping this task 34589 1727204132.83542: _execute() done 34589 1727204132.83548: dumping result to json 34589 1727204132.83553: done dumping result, returning 34589 1727204132.83570: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-a9c6-cddc-00000000008f] 34589 1727204132.83678: sending task result for task 028d2410-947f-a9c6-cddc-00000000008f 34589 1727204132.83753: done sending task result for task 028d2410-947f-a9c6-cddc-00000000008f 34589 1727204132.83756: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 34589 1727204132.83817: no more pending results, returning what we have 34589 1727204132.83821: results queue empty 34589 1727204132.83822: checking for any_errors_fatal 34589 1727204132.83829: done checking for any_errors_fatal 34589 1727204132.83830: checking for max_fail_percentage 34589 1727204132.83833: done checking for max_fail_percentage 34589 1727204132.83833: checking to see if all hosts have failed and the running result is not ok 34589 1727204132.83834: done checking to see if all hosts have failed 34589 1727204132.83835: getting the remaining hosts for this loop 34589 1727204132.83837: done getting the remaining hosts for this loop 34589 1727204132.83841: getting the next task for host managed-node1 34589 1727204132.83849: done getting next task for host managed-node1 34589 1727204132.83854: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204132.83856: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204132.83870: getting variables 34589 1727204132.83872: in VariableManager get_vars() 34589 1727204132.83925: Calling all_inventory to load vars for managed-node1 34589 1727204132.83929: Calling groups_inventory to load vars for managed-node1 34589 1727204132.83931: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204132.83943: Calling all_plugins_play to load vars for managed-node1 34589 1727204132.83946: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204132.83949: Calling groups_plugins_play to load vars for managed-node1 34589 1727204132.85751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204132.87433: done with get_vars() 34589 1727204132.87468: done getting variables 34589 1727204132.87538: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.103) 0:00:33.010 ***** 34589 1727204132.87578: entering _queue_task() for managed-node1/service 34589 1727204132.87958: worker is 1 (out of 1 available) 34589 1727204132.87971: exiting _queue_task() for managed-node1/service 34589 1727204132.88119: done queuing things up, now waiting for results queue to drain 34589 1727204132.88121: waiting for pending results... 34589 1727204132.88458: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34589 1727204132.88464: in run() - task 028d2410-947f-a9c6-cddc-000000000090 34589 1727204132.88467: variable 'ansible_search_path' from source: unknown 34589 1727204132.88470: variable 'ansible_search_path' from source: unknown 34589 1727204132.88505: calling self._execute() 34589 1727204132.88618: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.88660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.88663: variable 'omit' from source: magic vars 34589 1727204132.89041: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.89059: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204132.89251: variable 'network_provider' from source: set_fact 34589 1727204132.89311: variable 'network_state' from source: role '' defaults 34589 1727204132.89314: Evaluated conditional (network_provider == "nm" or network_state != {}): True 34589 1727204132.89317: variable 'omit' from source: magic vars 34589 1727204132.89333: variable 'omit' from source: magic vars 34589 1727204132.89417: variable 'network_service_name' from source: role '' defaults 34589 1727204132.89457: variable 'network_service_name' from source: role '' defaults 34589 1727204132.89577: variable '__network_provider_setup' from source: role '' defaults 34589 1727204132.89589: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204132.89664: variable '__network_service_name_default_nm' from source: role '' defaults 34589 1727204132.89683: variable '__network_packages_default_nm' from source: role '' defaults 
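Unlike the three tasks before it, this one passes its guard: network_state is empty but network_provider is "nm" on this host, so (network_provider == "nm" or network_state != {}) is True and the service is actually managed. A minimal sketch of such a task, built only from the names visible in this log (the real task at roles/network/tasks/main.yml:122 may carry additional parameters):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    # network_service_name comes from the role defaults resolved above
    # (__network_service_name_default_nm for the nm provider).
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

On a systemd host the service action plugin delegates to the systemd module, which is why the worker below transfers AnsiballZ_systemd.py and the result that comes back is a dump of the NetworkManager.service unit properties.
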
34589 1727204132.89754: variable '__network_packages_default_nm' from source: role '' defaults 34589 1727204132.90017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204132.92765: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204132.92831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204132.92897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204132.92982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204132.92986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204132.93066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.93120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.93152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.93206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.93328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.93331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.93334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.93347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.93393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.93418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.93671: variable '__network_packages_default_gobject_packages' from source: role '' defaults 34589 1727204132.93804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.93836: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.93865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.93921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.93942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.94045: variable 'ansible_python' from source: facts 34589 1727204132.94074: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 34589 1727204132.94213: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204132.94265: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204132.94406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.94446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.94479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.94536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.94580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.94609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204132.94650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204132.94677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.94782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204132.94785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204132.94871: variable 'network_connections' from 
source: play vars 34589 1727204132.94886: variable 'profile' from source: play vars 34589 1727204132.94968: variable 'profile' from source: play vars 34589 1727204132.94981: variable 'interface' from source: set_fact 34589 1727204132.95049: variable 'interface' from source: set_fact 34589 1727204132.95161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204132.95371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204132.95440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204132.95494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204132.95542: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204132.95682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204132.95685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204132.95697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204132.95738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204132.95792: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.96129: variable 'network_connections' from source: play vars 34589 1727204132.96132: variable 'profile' from source: play vars 34589 1727204132.96214: variable 'profile' from source: play vars 34589 1727204132.96238: variable 'interface' from source: set_fact 34589 1727204132.96347: variable 'interface' from source: set_fact 34589 1727204132.96350: variable '__network_packages_default_wireless' from source: role '' defaults 34589 1727204132.96425: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204132.96757: variable 'network_connections' from source: play vars 34589 1727204132.96767: variable 'profile' from source: play vars 34589 1727204132.96846: variable 'profile' from source: play vars 34589 1727204132.96857: variable 'interface' from source: set_fact 34589 1727204132.96944: variable 'interface' from source: set_fact 34589 1727204132.96979: variable '__network_packages_default_team' from source: role '' defaults 34589 1727204132.97068: variable '__network_team_connections_defined' from source: role '' defaults 34589 1727204132.97436: variable 'network_connections' from source: play vars 34589 1727204132.97440: variable 'profile' from source: play vars 34589 1727204132.97488: variable 'profile' from source: play vars 34589 1727204132.97498: variable 'interface' from source: set_fact 34589 1727204132.97585: variable 'interface' from source: set_fact 34589 1727204132.97655: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204132.97723: variable '__network_service_name_default_initscripts' from source: role '' defaults 34589 1727204132.97763: 
variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204132.97806: variable '__network_packages_default_initscripts' from source: role '' defaults 34589 1727204132.98040: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 34589 1727204132.98590: variable 'network_connections' from source: play vars 34589 1727204132.98637: variable 'profile' from source: play vars 34589 1727204132.98675: variable 'profile' from source: play vars 34589 1727204132.98687: variable 'interface' from source: set_fact 34589 1727204132.98757: variable 'interface' from source: set_fact 34589 1727204132.98768: variable 'ansible_distribution' from source: facts 34589 1727204132.98774: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.98982: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.98985: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 34589 1727204132.98987: variable 'ansible_distribution' from source: facts 34589 1727204132.98989: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.98991: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.98999: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 34589 1727204132.99166: variable 'ansible_distribution' from source: facts 34589 1727204132.99174: variable '__network_rh_distros' from source: role '' defaults 34589 1727204132.99186: variable 'ansible_distribution_major_version' from source: facts 34589 1727204132.99233: variable 'network_provider' from source: set_fact 34589 1727204132.99260: variable 'omit' from source: magic vars 34589 1727204132.99294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204132.99334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204132.99358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204132.99385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204132.99399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204132.99439: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204132.99447: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.99454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.99564: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204132.99574: Set connection var ansible_shell_executable to /bin/sh 34589 1727204132.99590: Set connection var ansible_timeout to 10 34589 1727204132.99596: Set connection var ansible_shell_type to sh 34589 1727204132.99606: Set connection var ansible_connection to ssh 34589 1727204132.99618: Set connection var ansible_pipelining to False 34589 1727204132.99651: variable 'ansible_shell_executable' from source: unknown 34589 1727204132.99761: variable 'ansible_connection' from source: unknown 34589 1727204132.99764: variable 'ansible_module_compression' from source: unknown 34589 1727204132.99766: variable 'ansible_shell_type' from source: unknown 34589 1727204132.99768: variable 'ansible_shell_executable' from 
source: unknown 34589 1727204132.99770: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204132.99780: variable 'ansible_pipelining' from source: unknown 34589 1727204132.99782: variable 'ansible_timeout' from source: unknown 34589 1727204132.99785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204132.99822: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204132.99837: variable 'omit' from source: magic vars 34589 1727204132.99847: starting attempt loop 34589 1727204132.99852: running the handler 34589 1727204132.99941: variable 'ansible_facts' from source: unknown 34589 1727204133.00706: _low_level_execute_command(): starting 34589 1727204133.00721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204133.01496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.01554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.01571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.01615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.01730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.03529: stdout chunk (state=3): >>>/root <<< 34589 1727204133.03683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.03686: stdout chunk (state=3): >>><<< 34589 1727204133.03689: stderr chunk (state=3): >>><<< 34589 1727204133.03810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.03814: _low_level_execute_command(): starting 34589 1727204133.03817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661 `" && echo ansible-tmp-1727204133.0371459-37471-136020587037661="` echo /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661 `" ) && sleep 0' 34589 1727204133.04353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204133.04365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.04379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.04393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.04416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204133.04426: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204133.04437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.04454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204133.04528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.04551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.04565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.04587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.04696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.07125: stdout chunk (state=3): >>>ansible-tmp-1727204133.0371459-37471-136020587037661=/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661 <<< 34589 1727204133.07182: stdout chunk (state=3): >>><<< 34589 1727204133.07190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.07199: stderr chunk (state=3): >>><<< 34589 1727204133.07218: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204133.0371459-37471-136020587037661=/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.07254: variable 'ansible_module_compression' from source: unknown 34589 1727204133.07312: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 34589 1727204133.07385: variable 'ansible_facts' from source: unknown 34589 1727204133.07605: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py 34589 1727204133.07830: Sending initial data 34589 1727204133.07834: Sent initial data (156 bytes) 34589 1727204133.08340: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204133.08353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.08460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.08477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.08588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.10337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 <<< 34589 1727204133.10345: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204133.10425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204133.10500: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpfkb6ewpx /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py <<< 34589 1727204133.10512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py" <<< 34589 1727204133.10582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpfkb6ewpx" to remote "/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py" <<< 34589 1727204133.13748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.14048: stderr chunk (state=3): >>><<< 34589 1727204133.14052: stdout chunk (state=3): >>><<< 34589 1727204133.14054: done transferring module to remote 34589 1727204133.14057: _low_level_execute_command(): starting 34589 1727204133.14059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/ /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py && sleep 0' 34589 1727204133.15059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204133.15192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.15292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.15303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.15423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.17441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.17445: stdout chunk (state=3): >>><<< 34589 1727204133.17682: stderr chunk (state=3): >>><<< 34589 1727204133.17686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.17689: _low_level_execute_command(): starting 34589 1727204133.17691: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/AnsiballZ_systemd.py && sleep 0' 34589 1727204133.18052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204133.18061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.18071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.18086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.18098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204133.18104: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204133.18114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.18137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204133.18144: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204133.18150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204133.18158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.18167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.18238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.18313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.18316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.18318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.18320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.18418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.49592: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": 
"on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10711040", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3292884992", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1491848000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": 
"infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 34589 1727204133.49628: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 34589 1727204133.51979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204133.52000: stdout chunk (state=3): >>><<< 34589 1727204133.52016: stderr chunk (state=3): >>><<< 34589 1727204133.52037: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10711040", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3292884992", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1491848000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204133.52249: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204133.52271: _low_level_execute_command(): starting 34589 1727204133.52282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204133.0371459-37471-136020587037661/ > /dev/null 2>&1 && sleep 0' 34589 1727204133.52902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204133.52928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.52946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.52964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.52985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204133.53048: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.53097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 
1727204133.53119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.53154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.53269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.55299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.55314: stdout chunk (state=3): >>><<< 34589 1727204133.55335: stderr chunk (state=3): >>><<< 34589 1727204133.55354: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.55367: handler run complete 34589 1727204133.55448: attempt loop complete, returning result 34589 1727204133.55539: _execute() done 34589 1727204133.55543: dumping result to json 34589 1727204133.55545: done dumping result, returning 34589 1727204133.55547: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-a9c6-cddc-000000000090] 34589 1727204133.55550: sending task result for task 028d2410-947f-a9c6-cddc-000000000090 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 1727204133.56034: no more pending results, returning what we have 34589 1727204133.56037: results queue empty 34589 1727204133.56038: checking for any_errors_fatal 34589 1727204133.56045: done checking for any_errors_fatal 34589 1727204133.56046: checking for max_fail_percentage 34589 1727204133.56048: done checking for max_fail_percentage 34589 1727204133.56049: checking to see if all hosts have failed and the running result is not ok 34589 1727204133.56050: done checking to see if all hosts have failed 34589 1727204133.56050: getting the remaining hosts for this loop 34589 1727204133.56052: done getting the remaining hosts for this loop 34589 1727204133.56062: getting the next task for host managed-node1 34589 1727204133.56069: done getting next task for host managed-node1 34589 1727204133.56073: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204133.56079: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204133.56180: getting variables 34589 1727204133.56182: in VariableManager get_vars() 34589 1727204133.56219: Calling all_inventory to load vars for managed-node1 34589 1727204133.56222: Calling groups_inventory to load vars for managed-node1 34589 1727204133.56225: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204133.56235: Calling all_plugins_play to load vars for managed-node1 34589 1727204133.56239: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204133.56242: Calling groups_plugins_play to load vars for managed-node1 34589 1727204133.56794: done sending task result for task 028d2410-947f-a9c6-cddc-000000000090 34589 1727204133.56798: WORKER PROCESS EXITING 34589 1727204133.58031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204133.59731: done with get_vars() 34589 1727204133.59753: done getting variables 34589 1727204133.59827: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.722) 0:00:33.733 ***** 34589 1727204133.59856: entering _queue_task() for managed-node1/service 34589 1727204133.60190: worker is 1 (out of 1 available) 34589 1727204133.60214: exiting _queue_task() for managed-node1/service 34589 1727204133.60227: done queuing things up, now waiting for results queue to drain 34589 1727204133.60229: waiting for pending results... 
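For reference, the systemd module invocation recorded above (ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, scope=system) corresponds roughly to the following standalone sketch. This is a reconstruction for illustration only, not the role's verbatim task; in the actual role the service name and state come from role variables, and the task lives in roles/network/tasks/main.yml.

# Hypothetical reproduction of the "Enable and start NetworkManager" step seen
# in the log above; the real role task is parameterized by role variables.
- hosts: managed-node1
  become: true
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager     # matches the module_args shown in the log
        state: started
        enabled: true
        scope: system
      no_log: true               # the log shows the result was censored via no_log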
34589 1727204133.60393: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34589 1727204133.60469: in run() - task 028d2410-947f-a9c6-cddc-000000000091 34589 1727204133.60483: variable 'ansible_search_path' from source: unknown 34589 1727204133.60488: variable 'ansible_search_path' from source: unknown 34589 1727204133.60515: calling self._execute() 34589 1727204133.60597: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.60601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.60611: variable 'omit' from source: magic vars 34589 1727204133.60880: variable 'ansible_distribution_major_version' from source: facts 34589 1727204133.60891: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204133.60969: variable 'network_provider' from source: set_fact 34589 1727204133.60973: Evaluated conditional (network_provider == "nm"): True 34589 1727204133.61038: variable '__network_wpa_supplicant_required' from source: role '' defaults 34589 1727204133.61105: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 34589 1727204133.61215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204133.63080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204133.63131: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204133.63160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204133.63189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204133.63208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204133.63282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204133.63301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204133.63322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204133.63347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204133.63358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204133.63395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204133.63414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 34589 1727204133.63430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204133.63454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204133.63465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204133.63497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34589 1727204133.63517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204133.63534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204133.63558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204133.63568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204133.63668: variable 'network_connections' from source: play vars 34589 1727204133.63679: variable 'profile' from source: play vars 34589 1727204133.63732: variable 'profile' from source: play vars 34589 1727204133.63736: variable 'interface' from source: set_fact 34589 1727204133.63778: variable 'interface' from source: set_fact 34589 1727204133.63835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34589 1727204133.63943: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34589 1727204133.63969: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34589 1727204133.63992: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34589 1727204133.64015: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34589 1727204133.64048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34589 1727204133.64064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34589 1727204133.64083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204133.64100: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34589 1727204133.64140: variable '__network_wireless_connections_defined' from source: role '' defaults 34589 1727204133.64299: variable 'network_connections' from source: play vars 34589 1727204133.64303: variable 'profile' from source: play vars 34589 1727204133.64347: variable 'profile' from source: play vars 34589 1727204133.64350: variable 'interface' from source: set_fact 34589 1727204133.64395: variable 'interface' from source: set_fact 34589 1727204133.64419: Evaluated conditional (__network_wpa_supplicant_required): False 34589 1727204133.64422: when evaluation is False, skipping this task 34589 1727204133.64425: _execute() done 34589 1727204133.64436: dumping result to json 34589 1727204133.64438: done dumping result, returning 34589 1727204133.64441: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-a9c6-cddc-000000000091] 34589 1727204133.64443: sending task result for task 028d2410-947f-a9c6-cddc-000000000091 34589 1727204133.64525: done sending task result for task 028d2410-947f-a9c6-cddc-000000000091 34589 1727204133.64528: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 34589 1727204133.64574: no more pending results, returning what we have 34589 1727204133.64579: results queue empty 34589 1727204133.64580: checking for any_errors_fatal 34589 1727204133.64599: done checking for any_errors_fatal 34589 1727204133.64599: checking for max_fail_percentage 34589 1727204133.64601: done checking for max_fail_percentage 34589 1727204133.64602: checking to see if all hosts have failed and the running result is not ok 34589 1727204133.64602: done checking to see if all hosts have failed 34589 1727204133.64603: getting the remaining hosts for this loop 34589 1727204133.64604: done getting the remaining hosts for this loop 34589 1727204133.64608: getting the next task for host managed-node1 34589 1727204133.64615: done getting next task for host managed-node1 34589 1727204133.64618: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204133.64620: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204133.64632: getting variables 34589 1727204133.64634: in VariableManager get_vars() 34589 1727204133.64670: Calling all_inventory to load vars for managed-node1 34589 1727204133.64673: Calling groups_inventory to load vars for managed-node1 34589 1727204133.64677: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204133.64687: Calling all_plugins_play to load vars for managed-node1 34589 1727204133.64689: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204133.64692: Calling groups_plugins_play to load vars for managed-node1 34589 1727204133.65980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204133.67202: done with get_vars() 34589 1727204133.67219: done getting variables 34589 1727204133.67259: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.074) 0:00:33.807 ***** 34589 1727204133.67283: entering _queue_task() for managed-node1/service 34589 1727204133.67517: worker is 1 (out of 1 available) 34589 1727204133.67532: exiting _queue_task() for managed-node1/service 34589 1727204133.67544: done queuing things up, now waiting for results queue to drain 34589 1727204133.67545: waiting for pending results... 34589 1727204133.67718: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 34589 1727204133.67791: in run() - task 028d2410-947f-a9c6-cddc-000000000092 34589 1727204133.67804: variable 'ansible_search_path' from source: unknown 34589 1727204133.67808: variable 'ansible_search_path' from source: unknown 34589 1727204133.67838: calling self._execute() 34589 1727204133.67912: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.67919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.67926: variable 'omit' from source: magic vars 34589 1727204133.68233: variable 'ansible_distribution_major_version' from source: facts 34589 1727204133.68243: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204133.68359: variable 'network_provider' from source: set_fact 34589 1727204133.68363: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204133.68366: when evaluation is False, skipping this task 34589 1727204133.68369: _execute() done 34589 1727204133.68371: dumping result to json 34589 1727204133.68373: done dumping result, returning 34589 1727204133.68379: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-a9c6-cddc-000000000092] 34589 1727204133.68391: sending task result for task 028d2410-947f-a9c6-cddc-000000000092 34589 1727204133.68511: done sending task result for task 028d2410-947f-a9c6-cddc-000000000092 34589 1727204133.68514: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34589 
1727204133.68713: no more pending results, returning what we have 34589 1727204133.68716: results queue empty 34589 1727204133.68717: checking for any_errors_fatal 34589 1727204133.68723: done checking for any_errors_fatal 34589 1727204133.68723: checking for max_fail_percentage 34589 1727204133.68725: done checking for max_fail_percentage 34589 1727204133.68726: checking to see if all hosts have failed and the running result is not ok 34589 1727204133.68726: done checking to see if all hosts have failed 34589 1727204133.68727: getting the remaining hosts for this loop 34589 1727204133.68728: done getting the remaining hosts for this loop 34589 1727204133.68731: getting the next task for host managed-node1 34589 1727204133.68736: done getting next task for host managed-node1 34589 1727204133.68739: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204133.68741: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204133.68753: getting variables 34589 1727204133.68754: in VariableManager get_vars() 34589 1727204133.68786: Calling all_inventory to load vars for managed-node1 34589 1727204133.68788: Calling groups_inventory to load vars for managed-node1 34589 1727204133.68790: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204133.68798: Calling all_plugins_play to load vars for managed-node1 34589 1727204133.68801: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204133.68804: Calling groups_plugins_play to load vars for managed-node1 34589 1727204133.69945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204133.71304: done with get_vars() 34589 1727204133.71322: done getting variables 34589 1727204133.71366: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.041) 0:00:33.848 ***** 34589 1727204133.71390: entering _queue_task() for managed-node1/copy 34589 1727204133.71621: worker is 1 (out of 1 available) 34589 1727204133.71633: exiting _queue_task() for managed-node1/copy 34589 1727204133.71645: done queuing things up, now waiting for results queue to drain 34589 1727204133.71647: waiting for pending results... 
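The two skips recorded above follow the role's conditional pattern: "Enable and start wpa_supplicant" (main.yml:133) is guarded by __network_wpa_supplicant_required, which the role derives from whether any wireless or IEEE 802.1x connections are defined, and "Enable network service" (main.yml:142) is guarded by network_provider == "initscripts". The sketch below illustrates that when-guarded pattern; the variable names and conditions are taken from the log, but the task bodies are assumptions and may differ from the role's actual task file.

# Illustrative task-file sketch (not the role's verbatim content) showing the
# conditionals that produced the "Conditional result was False" skips above.
- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required    # evaluates to False in this run

- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"  # provider is "nm" here, so skipped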
34589 1727204133.71831: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34589 1727204133.71907: in run() - task 028d2410-947f-a9c6-cddc-000000000093 34589 1727204133.71921: variable 'ansible_search_path' from source: unknown 34589 1727204133.71925: variable 'ansible_search_path' from source: unknown 34589 1727204133.71952: calling self._execute() 34589 1727204133.72030: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.72037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.72045: variable 'omit' from source: magic vars 34589 1727204133.72324: variable 'ansible_distribution_major_version' from source: facts 34589 1727204133.72333: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204133.72413: variable 'network_provider' from source: set_fact 34589 1727204133.72417: Evaluated conditional (network_provider == "initscripts"): False 34589 1727204133.72420: when evaluation is False, skipping this task 34589 1727204133.72423: _execute() done 34589 1727204133.72434: dumping result to json 34589 1727204133.72437: done dumping result, returning 34589 1727204133.72440: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-a9c6-cddc-000000000093] 34589 1727204133.72443: sending task result for task 028d2410-947f-a9c6-cddc-000000000093 34589 1727204133.72525: done sending task result for task 028d2410-947f-a9c6-cddc-000000000093 34589 1727204133.72527: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 34589 1727204133.72579: no more pending results, returning what we have 34589 1727204133.72583: results queue empty 34589 1727204133.72584: checking for any_errors_fatal 34589 1727204133.72591: done checking for any_errors_fatal 34589 1727204133.72592: checking for max_fail_percentage 34589 1727204133.72593: done checking for max_fail_percentage 34589 1727204133.72594: checking to see if all hosts have failed and the running result is not ok 34589 1727204133.72595: done checking to see if all hosts have failed 34589 1727204133.72596: getting the remaining hosts for this loop 34589 1727204133.72597: done getting the remaining hosts for this loop 34589 1727204133.72600: getting the next task for host managed-node1 34589 1727204133.72606: done getting next task for host managed-node1 34589 1727204133.72609: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204133.72611: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204133.72624: getting variables 34589 1727204133.72625: in VariableManager get_vars() 34589 1727204133.72656: Calling all_inventory to load vars for managed-node1 34589 1727204133.72659: Calling groups_inventory to load vars for managed-node1 34589 1727204133.72660: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204133.72669: Calling all_plugins_play to load vars for managed-node1 34589 1727204133.72671: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204133.72673: Calling groups_plugins_play to load vars for managed-node1 34589 1727204133.74023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204133.74892: done with get_vars() 34589 1727204133.74908: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.035) 0:00:33.884 ***** 34589 1727204133.74967: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204133.75203: worker is 1 (out of 1 available) 34589 1727204133.75217: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 34589 1727204133.75230: done queuing things up, now waiting for results queue to drain 34589 1727204133.75231: waiting for pending results... 34589 1727204133.75407: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34589 1727204133.75476: in run() - task 028d2410-947f-a9c6-cddc-000000000094 34589 1727204133.75489: variable 'ansible_search_path' from source: unknown 34589 1727204133.75493: variable 'ansible_search_path' from source: unknown 34589 1727204133.75524: calling self._execute() 34589 1727204133.75602: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.75606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.75618: variable 'omit' from source: magic vars 34589 1727204133.75887: variable 'ansible_distribution_major_version' from source: facts 34589 1727204133.75901: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204133.75905: variable 'omit' from source: magic vars 34589 1727204133.75936: variable 'omit' from source: magic vars 34589 1727204133.76048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34589 1727204133.77479: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34589 1727204133.77528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34589 1727204133.77555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34589 1727204133.77581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34589 1727204133.77600: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34589 1727204133.77656: variable 'network_provider' from source: set_fact 34589 1727204133.77748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 34589 1727204133.77780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34589 1727204133.77797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34589 1727204133.77826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34589 1727204133.77836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34589 1727204133.77891: variable 'omit' from source: magic vars 34589 1727204133.77969: variable 'omit' from source: magic vars 34589 1727204133.78039: variable 'network_connections' from source: play vars 34589 1727204133.78048: variable 'profile' from source: play vars 34589 1727204133.78099: variable 'profile' from source: play vars 34589 1727204133.78103: variable 'interface' from source: set_fact 34589 1727204133.78147: variable 'interface' from source: set_fact 34589 1727204133.78247: variable 'omit' from source: magic vars 34589 1727204133.78254: variable '__lsr_ansible_managed' from source: task vars 34589 1727204133.78307: variable '__lsr_ansible_managed' from source: task vars 34589 1727204133.78492: Loaded config def from plugin (lookup/template) 34589 1727204133.78497: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 34589 1727204133.78522: File lookup term: get_ansible_managed.j2 34589 1727204133.78525: variable 'ansible_search_path' from source: unknown 34589 1727204133.78529: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 34589 1727204133.78539: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 34589 1727204133.78552: variable 'ansible_search_path' from source: unknown 34589 1727204133.85922: variable 'ansible_managed' from source: unknown 34589 1727204133.85998: variable 'omit' from source: magic vars 34589 1727204133.86021: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204133.86038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204133.86048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204133.86059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204133.86066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204133.86081: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204133.86085: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.86087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.86149: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204133.86153: Set connection var ansible_shell_executable to /bin/sh 34589 1727204133.86160: Set connection var ansible_timeout to 10 34589 1727204133.86163: Set connection var ansible_shell_type to sh 34589 1727204133.86169: Set connection var ansible_connection to ssh 34589 1727204133.86173: Set connection var ansible_pipelining to False 34589 1727204133.86191: variable 'ansible_shell_executable' from source: unknown 34589 1727204133.86194: variable 'ansible_connection' from source: unknown 34589 1727204133.86196: variable 'ansible_module_compression' from source: unknown 34589 1727204133.86199: variable 'ansible_shell_type' from source: unknown 34589 1727204133.86201: variable 'ansible_shell_executable' from source: unknown 34589 1727204133.86203: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204133.86207: variable 'ansible_pipelining' from source: unknown 34589 1727204133.86213: variable 'ansible_timeout' from source: unknown 34589 1727204133.86217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204133.86301: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204133.86312: variable 'omit' from source: magic vars 34589 1727204133.86315: starting attempt loop 34589 1727204133.86318: running the handler 34589 1727204133.86327: _low_level_execute_command(): starting 34589 1727204133.86332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204133.86819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.86822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.86825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.86827: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.86880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.86883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.86887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.86977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.88763: stdout chunk (state=3): >>>/root <<< 34589 1727204133.88865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.88895: stderr chunk (state=3): >>><<< 34589 1727204133.88899: stdout chunk (state=3): >>><<< 34589 1727204133.88916: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.88926: _low_level_execute_command(): starting 34589 1727204133.88937: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490 `" && echo ansible-tmp-1727204133.8891637-37504-213591203410490="` echo /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490 `" ) && sleep 0' 34589 1727204133.89345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.89384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204133.89387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204133.89390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.89392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 34589 1727204133.89394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.89441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.89444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.89449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.89528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.91608: stdout chunk (state=3): >>>ansible-tmp-1727204133.8891637-37504-213591203410490=/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490 <<< 34589 1727204133.91720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.91747: stderr chunk (state=3): >>><<< 34589 1727204133.91750: stdout chunk (state=3): >>><<< 34589 1727204133.91764: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204133.8891637-37504-213591203410490=/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.91797: variable 'ansible_module_compression' from source: unknown 34589 1727204133.91829: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 34589 1727204133.91870: variable 'ansible_facts' from source: unknown 34589 1727204133.91961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py 34589 1727204133.92053: Sending initial data 34589 1727204133.92056: Sent initial data (168 bytes) 34589 1727204133.92469: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.92479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 
1727204133.92503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.92506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.92511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.92565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.92572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.92575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.92650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.94378: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34589 1727204133.94381: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204133.94452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204133.94529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2y5qbgcy /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py <<< 34589 1727204133.94532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py" <<< 34589 1727204133.94604: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2y5qbgcy" to remote "/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py" <<< 34589 1727204133.94607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py" <<< 34589 1727204133.95456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.95490: stderr chunk (state=3): >>><<< 34589 1727204133.95493: stdout chunk (state=3): >>><<< 34589 1727204133.95516: done transferring module to remote 34589 1727204133.95524: _low_level_execute_command(): starting 34589 1727204133.95528: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/ /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py && sleep 0' 34589 1727204133.95941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.95945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.95947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204133.95949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204133.95951: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.96000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204133.96008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.96087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204133.98062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204133.98086: stderr chunk (state=3): >>><<< 34589 1727204133.98089: stdout chunk (state=3): >>><<< 34589 1727204133.98103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204133.98107: _low_level_execute_command(): starting 34589 1727204133.98113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/AnsiballZ_network_connections.py && sleep 0' 34589 1727204133.98533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204133.98536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.98538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204133.98540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204133.98601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204133.98603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204133.98685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.28818: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9i8n2erl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9i8n2erl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection 
volatilize aborted on ethtest0/0a64492c-4969-466c-9920-91c73029e796: error=unknown <<< 34589 1727204134.29018: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 34589 1727204134.31391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204134.31395: stdout chunk (state=3): >>><<< 34589 1727204134.31398: stderr chunk (state=3): >>><<< 34589 1727204134.31400: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9i8n2erl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9i8n2erl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/0a64492c-4969-466c-9920-91c73029e796: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204134.31403: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204134.31406: _low_level_execute_command(): starting 34589 1727204134.31411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204133.8891637-37504-213591203410490/ > /dev/null 2>&1 && sleep 0' 34589 1727204134.32183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204134.32215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.32334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.34379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204134.34383: stdout chunk (state=3): >>><<< 34589 1727204134.34385: stderr chunk (state=3): >>><<< 34589 1727204134.34581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204134.34589: handler run complete 34589 1727204134.34591: attempt loop complete, returning result 34589 1727204134.34594: _execute() done 34589 1727204134.34596: dumping result to json 34589 1727204134.34598: done dumping result, returning 34589 1727204134.34600: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-a9c6-cddc-000000000094] 34589 1727204134.34602: sending task result for task 028d2410-947f-a9c6-cddc-000000000094 34589 1727204134.34677: done sending task result for task 028d2410-947f-a9c6-cddc-000000000094 34589 1727204134.34681: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 34589 1727204134.34785: no more pending results, returning what we have 34589 1727204134.34789: results queue empty 34589 1727204134.34793: checking for any_errors_fatal 34589 1727204134.34800: done checking for any_errors_fatal 34589 1727204134.34801: checking for max_fail_percentage 34589 1727204134.34802: done checking for max_fail_percentage 34589 1727204134.34803: checking to see if all hosts have failed and the running result is not ok 34589 1727204134.34804: done checking to see if all hosts have failed 34589 1727204134.34805: getting the remaining hosts for this loop 34589 1727204134.34806: done getting the remaining hosts for this loop 34589 1727204134.34812: getting the next task for host managed-node1 34589 1727204134.34818: done getting next task for host managed-node1 34589 1727204134.34822: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204134.34824: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204134.34833: getting variables 34589 1727204134.34835: in VariableManager get_vars() 34589 1727204134.34870: Calling all_inventory to load vars for managed-node1 34589 1727204134.34872: Calling groups_inventory to load vars for managed-node1 34589 1727204134.34875: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.35017: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.35020: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.35023: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.36932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204134.38502: done with get_vars() 34589 1727204134.38537: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.636) 0:00:34.521 ***** 34589 1727204134.38633: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204134.39083: worker is 1 (out of 1 available) 34589 1727204134.39097: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 34589 1727204134.39112: done queuing things up, now waiting for results queue to drain 34589 1727204134.39114: waiting for pending results... 34589 1727204134.39498: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 34589 1727204134.39512: in run() - task 028d2410-947f-a9c6-cddc-000000000095 34589 1727204134.39527: variable 'ansible_search_path' from source: unknown 34589 1727204134.39536: variable 'ansible_search_path' from source: unknown 34589 1727204134.39583: calling self._execute() 34589 1727204134.39717: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.39741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.39756: variable 'omit' from source: magic vars 34589 1727204134.40247: variable 'ansible_distribution_major_version' from source: facts 34589 1727204134.40249: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204134.40386: variable 'network_state' from source: role '' defaults 34589 1727204134.40405: Evaluated conditional (network_state != {}): False 34589 1727204134.40418: when evaluation is False, skipping this task 34589 1727204134.40427: _execute() done 34589 1727204134.40434: dumping result to json 34589 1727204134.40465: done dumping result, returning 34589 1727204134.40469: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-a9c6-cddc-000000000095] 34589 1727204134.40471: sending task result for task 028d2410-947f-a9c6-cddc-000000000095 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 34589 1727204134.40737: no more pending results, returning what we have 34589 1727204134.40742: results queue empty 34589 1727204134.40743: checking for any_errors_fatal 34589 1727204134.40758: done checking for any_errors_fatal 34589 1727204134.40759: checking for max_fail_percentage 34589 1727204134.40761: done checking for max_fail_percentage 34589 1727204134.40762: checking to see if all hosts have failed and the running result is 
not ok 34589 1727204134.40763: done checking to see if all hosts have failed 34589 1727204134.40763: getting the remaining hosts for this loop 34589 1727204134.40765: done getting the remaining hosts for this loop 34589 1727204134.40768: getting the next task for host managed-node1 34589 1727204134.40978: done getting next task for host managed-node1 34589 1727204134.40984: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204134.40986: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204134.41004: getting variables 34589 1727204134.41006: in VariableManager get_vars() 34589 1727204134.41042: Calling all_inventory to load vars for managed-node1 34589 1727204134.41045: Calling groups_inventory to load vars for managed-node1 34589 1727204134.41047: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.41058: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.41060: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.41064: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.41583: done sending task result for task 028d2410-947f-a9c6-cddc-000000000095 34589 1727204134.41587: WORKER PROCESS EXITING 34589 1727204134.42561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204134.43957: done with get_vars() 34589 1727204134.43973: done getting variables 34589 1727204134.44025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.054) 0:00:34.575 ***** 34589 1727204134.44048: entering _queue_task() for managed-node1/debug 34589 1727204134.44293: worker is 1 (out of 1 available) 34589 1727204134.44308: exiting _queue_task() for managed-node1/debug 34589 1727204134.44321: done queuing things up, now waiting for results queue to drain 34589 1727204134.44323: waiting for pending results... 
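
A note on the module stdout captured in the preceding task: the remote network_connections module printed a Python traceback (the LsrNetworkNmError from the volatilize callback) and then its JSON result on the same stdout stream, yet the task was still reported as changed with rc=0. That works because the machine-readable result is the trailing JSON object on stdout. The sketch below is purely illustrative and is not Ansible's actual result parser; the helper name extract_result and the shortened sample string are assumptions made for the example, requiring only the standard library.

import json

def extract_result(stdout: str) -> dict:
    """Return the JSON object that starts at the first line beginning with '{'."""
    lines = stdout.splitlines()
    for i, line in enumerate(lines):
        if line.lstrip().startswith("{"):
            # Parse from the first JSON-looking line to the end of stdout.
            return json.loads("\n".join(lines[i:]))
    raise ValueError("no JSON result found in module stdout")

# Shortened, synthetic stand-in for the mixed stdout seen above:
mixed = (
    "Traceback (most recent call last):\n"
    '  File "connection.py", line 113, in _nm_profile_volatile_update2_call_back\n'
    "LsrNetworkNmError: Connection volatilize aborted: error=unknown\n"
    '{"changed": true, "warnings": [], "stderr": "\\n"}'
)
print(extract_result(mixed)["changed"])   # prints: True

This is why the play continues normally here: the traceback is only noise ahead of the result document, and the parsed result still says changed=true with empty stderr_lines.
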
34589 1727204134.44506: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34589 1727204134.44580: in run() - task 028d2410-947f-a9c6-cddc-000000000096 34589 1727204134.44593: variable 'ansible_search_path' from source: unknown 34589 1727204134.44597: variable 'ansible_search_path' from source: unknown 34589 1727204134.44629: calling self._execute() 34589 1727204134.44704: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.44711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.44721: variable 'omit' from source: magic vars 34589 1727204134.44997: variable 'ansible_distribution_major_version' from source: facts 34589 1727204134.45006: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204134.45014: variable 'omit' from source: magic vars 34589 1727204134.45042: variable 'omit' from source: magic vars 34589 1727204134.45066: variable 'omit' from source: magic vars 34589 1727204134.45106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204134.45133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204134.45148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204134.45161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.45170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.45194: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204134.45198: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.45202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.45271: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204134.45274: Set connection var ansible_shell_executable to /bin/sh 34589 1727204134.45284: Set connection var ansible_timeout to 10 34589 1727204134.45286: Set connection var ansible_shell_type to sh 34589 1727204134.45292: Set connection var ansible_connection to ssh 34589 1727204134.45297: Set connection var ansible_pipelining to False 34589 1727204134.45319: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.45322: variable 'ansible_connection' from source: unknown 34589 1727204134.45325: variable 'ansible_module_compression' from source: unknown 34589 1727204134.45327: variable 'ansible_shell_type' from source: unknown 34589 1727204134.45329: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.45331: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.45336: variable 'ansible_pipelining' from source: unknown 34589 1727204134.45338: variable 'ansible_timeout' from source: unknown 34589 1727204134.45342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.45447: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 
1727204134.45456: variable 'omit' from source: magic vars 34589 1727204134.45461: starting attempt loop 34589 1727204134.45463: running the handler 34589 1727204134.45572: variable '__network_connections_result' from source: set_fact 34589 1727204134.45695: handler run complete 34589 1727204134.45699: attempt loop complete, returning result 34589 1727204134.45701: _execute() done 34589 1727204134.45704: dumping result to json 34589 1727204134.45706: done dumping result, returning 34589 1727204134.45710: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-a9c6-cddc-000000000096] 34589 1727204134.45712: sending task result for task 028d2410-947f-a9c6-cddc-000000000096 34589 1727204134.45769: done sending task result for task 028d2410-947f-a9c6-cddc-000000000096 34589 1727204134.45771: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 34589 1727204134.45847: no more pending results, returning what we have 34589 1727204134.45851: results queue empty 34589 1727204134.45851: checking for any_errors_fatal 34589 1727204134.45858: done checking for any_errors_fatal 34589 1727204134.45858: checking for max_fail_percentage 34589 1727204134.45860: done checking for max_fail_percentage 34589 1727204134.45861: checking to see if all hosts have failed and the running result is not ok 34589 1727204134.45862: done checking to see if all hosts have failed 34589 1727204134.45862: getting the remaining hosts for this loop 34589 1727204134.45864: done getting the remaining hosts for this loop 34589 1727204134.45867: getting the next task for host managed-node1 34589 1727204134.45871: done getting next task for host managed-node1 34589 1727204134.45874: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204134.45878: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204134.45888: getting variables 34589 1727204134.45890: in VariableManager get_vars() 34589 1727204134.45922: Calling all_inventory to load vars for managed-node1 34589 1727204134.45924: Calling groups_inventory to load vars for managed-node1 34589 1727204134.45927: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.45934: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.45936: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.45939: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.47094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204134.48054: done with get_vars() 34589 1727204134.48070: done getting variables 34589 1727204134.48113: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.040) 0:00:34.616 ***** 34589 1727204134.48134: entering _queue_task() for managed-node1/debug 34589 1727204134.48367: worker is 1 (out of 1 available) 34589 1727204134.48381: exiting _queue_task() for managed-node1/debug 34589 1727204134.48393: done queuing things up, now waiting for results queue to drain 34589 1727204134.48395: waiting for pending results... 34589 1727204134.48573: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34589 1727204134.48645: in run() - task 028d2410-947f-a9c6-cddc-000000000097 34589 1727204134.48657: variable 'ansible_search_path' from source: unknown 34589 1727204134.48660: variable 'ansible_search_path' from source: unknown 34589 1727204134.48692: calling self._execute() 34589 1727204134.48769: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.48774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.48784: variable 'omit' from source: magic vars 34589 1727204134.49194: variable 'ansible_distribution_major_version' from source: facts 34589 1727204134.49198: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204134.49200: variable 'omit' from source: magic vars 34589 1727204134.49213: variable 'omit' from source: magic vars 34589 1727204134.49249: variable 'omit' from source: magic vars 34589 1727204134.49292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204134.49337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204134.49363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204134.49386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.49412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.49445: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204134.49481: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.49484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.49573: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204134.49588: Set connection var ansible_shell_executable to /bin/sh 34589 1727204134.49602: Set connection var ansible_timeout to 10 34589 1727204134.49627: Set connection var ansible_shell_type to sh 34589 1727204134.49630: Set connection var ansible_connection to ssh 34589 1727204134.49634: Set connection var ansible_pipelining to False 34589 1727204134.49659: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.49680: variable 'ansible_connection' from source: unknown 34589 1727204134.49684: variable 'ansible_module_compression' from source: unknown 34589 1727204134.49686: variable 'ansible_shell_type' from source: unknown 34589 1727204134.49688: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.49736: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.49739: variable 'ansible_pipelining' from source: unknown 34589 1727204134.49742: variable 'ansible_timeout' from source: unknown 34589 1727204134.49744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.49866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204134.49887: variable 'omit' from source: magic vars 34589 1727204134.49898: starting attempt loop 34589 1727204134.49906: running the handler 34589 1727204134.49964: variable '__network_connections_result' from source: set_fact 34589 1727204134.50063: variable '__network_connections_result' from source: set_fact 34589 1727204134.50156: handler run complete 34589 1727204134.50194: attempt loop complete, returning result 34589 1727204134.50279: _execute() done 34589 1727204134.50283: dumping result to json 34589 1727204134.50285: done dumping result, returning 34589 1727204134.50288: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-a9c6-cddc-000000000097] 34589 1727204134.50290: sending task result for task 028d2410-947f-a9c6-cddc-000000000097 34589 1727204134.50360: done sending task result for task 028d2410-947f-a9c6-cddc-000000000097 34589 1727204134.50363: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 34589 1727204134.50449: no more pending results, returning what we have 34589 1727204134.50452: results queue empty 34589 1727204134.50453: checking for any_errors_fatal 34589 1727204134.50460: done checking for any_errors_fatal 34589 1727204134.50461: checking for max_fail_percentage 34589 1727204134.50462: done checking for max_fail_percentage 34589 1727204134.50463: checking to see 
if all hosts have failed and the running result is not ok 34589 1727204134.50464: done checking to see if all hosts have failed 34589 1727204134.50465: getting the remaining hosts for this loop 34589 1727204134.50466: done getting the remaining hosts for this loop 34589 1727204134.50469: getting the next task for host managed-node1 34589 1727204134.50477: done getting next task for host managed-node1 34589 1727204134.50480: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204134.50482: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204134.50495: getting variables 34589 1727204134.50497: in VariableManager get_vars() 34589 1727204134.50531: Calling all_inventory to load vars for managed-node1 34589 1727204134.50534: Calling groups_inventory to load vars for managed-node1 34589 1727204134.50537: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.50548: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.50552: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.50555: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.55301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204134.56181: done with get_vars() 34589 1727204134.56197: done getting variables 34589 1727204134.56234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.081) 0:00:34.697 ***** 34589 1727204134.56256: entering _queue_task() for managed-node1/debug 34589 1727204134.56624: worker is 1 (out of 1 available) 34589 1727204134.56638: exiting _queue_task() for managed-node1/debug 34589 1727204134.56649: done queuing things up, now waiting for results queue to drain 34589 1727204134.56651: waiting for pending results... 
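
Several tasks in this run are skipped because their when: conditions render false, for example "network_state != {}" against the role default network_state = {} (the "false_condition" entries above and the skip that follows), while "ansible_distribution_major_version != '6'" keeps evaluating true. The snippet below is a minimal sketch of that kind of Jinja2 conditional evaluation, not Ansible's own evaluator; the function name conditional_is_true is invented for the example and it assumes the jinja2 package is installed.

from jinja2 import Environment

def conditional_is_true(expr: str, variables: dict) -> bool:
    # Wrap the bare expression in an if/else template and render it
    # against the supplied variables, the way a when: clause is tested.
    template = "{% if " + expr + " %}True{% else %}False{% endif %}"
    return Environment().from_string(template).render(**variables) == "True"

# Matches the skipped network_state tasks: role default is an empty dict.
print(conditional_is_true("network_state != {}", {"network_state": {}}))                      # False
# Matches the distribution check that keeps passing on this host.
print(conditional_is_true("ansible_distribution_major_version != '6'",
                          {"ansible_distribution_major_version": "9"}))                        # True
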
34589 1727204134.57113: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34589 1727204134.57119: in run() - task 028d2410-947f-a9c6-cddc-000000000098 34589 1727204134.57123: variable 'ansible_search_path' from source: unknown 34589 1727204134.57127: variable 'ansible_search_path' from source: unknown 34589 1727204134.57186: calling self._execute() 34589 1727204134.57299: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.57304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.57307: variable 'omit' from source: magic vars 34589 1727204134.57605: variable 'ansible_distribution_major_version' from source: facts 34589 1727204134.57617: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204134.57704: variable 'network_state' from source: role '' defaults 34589 1727204134.57715: Evaluated conditional (network_state != {}): False 34589 1727204134.57718: when evaluation is False, skipping this task 34589 1727204134.57722: _execute() done 34589 1727204134.57725: dumping result to json 34589 1727204134.57728: done dumping result, returning 34589 1727204134.57735: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-a9c6-cddc-000000000098] 34589 1727204134.57742: sending task result for task 028d2410-947f-a9c6-cddc-000000000098 34589 1727204134.57829: done sending task result for task 028d2410-947f-a9c6-cddc-000000000098 34589 1727204134.57832: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 34589 1727204134.57885: no more pending results, returning what we have 34589 1727204134.57889: results queue empty 34589 1727204134.57890: checking for any_errors_fatal 34589 1727204134.57904: done checking for any_errors_fatal 34589 1727204134.57905: checking for max_fail_percentage 34589 1727204134.57906: done checking for max_fail_percentage 34589 1727204134.57907: checking to see if all hosts have failed and the running result is not ok 34589 1727204134.57908: done checking to see if all hosts have failed 34589 1727204134.57909: getting the remaining hosts for this loop 34589 1727204134.57910: done getting the remaining hosts for this loop 34589 1727204134.57914: getting the next task for host managed-node1 34589 1727204134.57919: done getting next task for host managed-node1 34589 1727204134.57923: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204134.57925: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204134.57938: getting variables 34589 1727204134.57940: in VariableManager get_vars() 34589 1727204134.57986: Calling all_inventory to load vars for managed-node1 34589 1727204134.57989: Calling groups_inventory to load vars for managed-node1 34589 1727204134.57991: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.57999: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.58002: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.58004: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.58873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204134.59760: done with get_vars() 34589 1727204134.59775: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.035) 0:00:34.733 ***** 34589 1727204134.59844: entering _queue_task() for managed-node1/ping 34589 1727204134.60081: worker is 1 (out of 1 available) 34589 1727204134.60095: exiting _queue_task() for managed-node1/ping 34589 1727204134.60108: done queuing things up, now waiting for results queue to drain 34589 1727204134.60109: waiting for pending results... 34589 1727204134.60288: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34589 1727204134.60364: in run() - task 028d2410-947f-a9c6-cddc-000000000099 34589 1727204134.60377: variable 'ansible_search_path' from source: unknown 34589 1727204134.60381: variable 'ansible_search_path' from source: unknown 34589 1727204134.60408: calling self._execute() 34589 1727204134.60489: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.60493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.60502: variable 'omit' from source: magic vars 34589 1727204134.60778: variable 'ansible_distribution_major_version' from source: facts 34589 1727204134.60789: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204134.60794: variable 'omit' from source: magic vars 34589 1727204134.60826: variable 'omit' from source: magic vars 34589 1727204134.60848: variable 'omit' from source: magic vars 34589 1727204134.60886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204134.60913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204134.60930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204134.60942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.60951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204134.60974: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204134.60979: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.60982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.61053: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204134.61057: Set 
connection var ansible_shell_executable to /bin/sh 34589 1727204134.61064: Set connection var ansible_timeout to 10 34589 1727204134.61067: Set connection var ansible_shell_type to sh 34589 1727204134.61072: Set connection var ansible_connection to ssh 34589 1727204134.61079: Set connection var ansible_pipelining to False 34589 1727204134.61101: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.61104: variable 'ansible_connection' from source: unknown 34589 1727204134.61108: variable 'ansible_module_compression' from source: unknown 34589 1727204134.61110: variable 'ansible_shell_type' from source: unknown 34589 1727204134.61112: variable 'ansible_shell_executable' from source: unknown 34589 1727204134.61115: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204134.61118: variable 'ansible_pipelining' from source: unknown 34589 1727204134.61120: variable 'ansible_timeout' from source: unknown 34589 1727204134.61123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204134.61268: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204134.61278: variable 'omit' from source: magic vars 34589 1727204134.61283: starting attempt loop 34589 1727204134.61286: running the handler 34589 1727204134.61298: _low_level_execute_command(): starting 34589 1727204134.61305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204134.61827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.61830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.61835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204134.61838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.61881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204134.61886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204134.61905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.61983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.63783: stdout chunk (state=3): >>>/root <<< 34589 1727204134.63886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204134.63915: stderr chunk (state=3): >>><<< 34589 1727204134.63919: stdout chunk (state=3): >>><<< 34589 1727204134.63939: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204134.63953: _low_level_execute_command(): starting 34589 1727204134.63958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817 `" && echo ansible-tmp-1727204134.6393895-37534-130308921867817="` echo /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817 `" ) && sleep 0' 34589 1727204134.64371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.64409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204134.64412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204134.64415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204134.64424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.64426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.64468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204134.64474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204134.64478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.64551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.66653: stdout chunk (state=3): >>>ansible-tmp-1727204134.6393895-37534-130308921867817=/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817 <<< 34589 1727204134.66762: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 34589 1727204134.66793: stderr chunk (state=3): >>><<< 34589 1727204134.66796: stdout chunk (state=3): >>><<< 34589 1727204134.66809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204134.6393895-37534-130308921867817=/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204134.66848: variable 'ansible_module_compression' from source: unknown 34589 1727204134.66882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 34589 1727204134.66915: variable 'ansible_facts' from source: unknown 34589 1727204134.66961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py 34589 1727204134.67062: Sending initial data 34589 1727204134.67065: Sent initial data (153 bytes) 34589 1727204134.67495: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204134.67498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204134.67501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204134.67503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.67505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.67550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204134.67554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204134.67565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 34589 1727204134.67647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.69384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34589 1727204134.69388: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204134.69456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204134.69532: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpzg12950p /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py <<< 34589 1727204134.69538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py" <<< 34589 1727204134.69608: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpzg12950p" to remote "/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py" <<< 34589 1727204134.69611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py" <<< 34589 1727204134.70267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204134.70305: stderr chunk (state=3): >>><<< 34589 1727204134.70310: stdout chunk (state=3): >>><<< 34589 1727204134.70329: done transferring module to remote 34589 1727204134.70339: _low_level_execute_command(): starting 34589 1727204134.70343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/ /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py && sleep 0' 34589 1727204134.70750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204134.70790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204134.70793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204134.70796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204134.70798: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.70803: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.70843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204134.70847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.70931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.72895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204134.72920: stderr chunk (state=3): >>><<< 34589 1727204134.72923: stdout chunk (state=3): >>><<< 34589 1727204134.72938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204134.72942: _low_level_execute_command(): starting 34589 1727204134.72945: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/AnsiballZ_ping.py && sleep 0' 34589 1727204134.73336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.73365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204134.73368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204134.73370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.73372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204134.73374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204134.73427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204134.73430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.73521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.90111: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 34589 1727204134.91754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204134.91758: stdout chunk (state=3): >>><<< 34589 1727204134.91764: stderr chunk (state=3): >>><<< 34589 1727204134.91785: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
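The exchange above is one complete module round trip: Ansible reuses the multiplexed SSH master, creates a remote temporary directory, uploads AnsiballZ_ping.py over SFTP, marks it executable, runs it with the remote /usr/bin/python3.12, and reads back the JSON result {"ping": "pong"}. The task driving it is identified in this log only by its name and path, so the sketch below is a hypothetical reconstruction, not the actual content of roles/network/tasks/main.yml; the task name and module match the trace, everything else is assumed.

# Hypothetical sketch of a task that would produce the trace above.
# A roughly equivalent ad-hoc check: ansible managed-node1 -i inventory -m ping -vvv
- name: Re-test connectivity
  ansible.builtin.ping: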
34589 1727204134.91844: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204134.91853: _low_level_execute_command(): starting 34589 1727204134.91855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.6393895-37534-130308921867817/ > /dev/null 2>&1 && sleep 0' 34589 1727204134.93162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204134.93330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204134.93476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204134.93690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204134.95781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204134.95893: stderr chunk (state=3): >>><<< 34589 1727204134.95896: stdout chunk (state=3): >>><<< 34589 1727204134.95916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204134.95981: handler run complete 34589 1727204134.95984: attempt loop complete, returning result 34589 1727204134.95987: _execute() done 34589 1727204134.95989: dumping result to json 34589 1727204134.95991: done dumping result, returning 34589 1727204134.95993: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-a9c6-cddc-000000000099] 34589 1727204134.95996: sending task result for task 028d2410-947f-a9c6-cddc-000000000099 34589 1727204134.96281: done sending task result for task 028d2410-947f-a9c6-cddc-000000000099 34589 1727204134.96284: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 34589 1727204134.96342: no more pending results, returning what we have 34589 1727204134.96345: results queue empty 34589 1727204134.96346: checking for any_errors_fatal 34589 1727204134.96352: done checking for any_errors_fatal 34589 1727204134.96353: checking for max_fail_percentage 34589 1727204134.96354: done checking for max_fail_percentage 34589 1727204134.96355: checking to see if all hosts have failed and the running result is not ok 34589 1727204134.96356: done checking to see if all hosts have failed 34589 1727204134.96357: getting the remaining hosts for this loop 34589 1727204134.96358: done getting the remaining hosts for this loop 34589 1727204134.96361: getting the next task for host managed-node1 34589 1727204134.96368: done getting next task for host managed-node1 34589 1727204134.96370: ^ task is: TASK: meta (role_complete) 34589 1727204134.96371: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204134.96382: getting variables 34589 1727204134.96384: in VariableManager get_vars() 34589 1727204134.96422: Calling all_inventory to load vars for managed-node1 34589 1727204134.96425: Calling groups_inventory to load vars for managed-node1 34589 1727204134.96427: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204134.96436: Calling all_plugins_play to load vars for managed-node1 34589 1727204134.96439: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204134.96441: Calling groups_plugins_play to load vars for managed-node1 34589 1727204134.99390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.02921: done with get_vars() 34589 1727204135.02950: done getting variables 34589 1727204135.03151: done queuing things up, now waiting for results queue to drain 34589 1727204135.03153: results queue empty 34589 1727204135.03154: checking for any_errors_fatal 34589 1727204135.03157: done checking for any_errors_fatal 34589 1727204135.03158: checking for max_fail_percentage 34589 1727204135.03159: done checking for max_fail_percentage 34589 1727204135.03160: checking to see if all hosts have failed and the running result is not ok 34589 1727204135.03161: done checking to see if all hosts have failed 34589 1727204135.03161: getting the remaining hosts for this loop 34589 1727204135.03162: done getting the remaining hosts for this loop 34589 1727204135.03165: getting the next task for host managed-node1 34589 1727204135.03169: done getting next task for host managed-node1 34589 1727204135.03171: ^ task is: TASK: meta (flush_handlers) 34589 1727204135.03173: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204135.03178: getting variables 34589 1727204135.03179: in VariableManager get_vars() 34589 1727204135.03277: Calling all_inventory to load vars for managed-node1 34589 1727204135.03281: Calling groups_inventory to load vars for managed-node1 34589 1727204135.03284: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204135.03290: Calling all_plugins_play to load vars for managed-node1 34589 1727204135.03292: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204135.03295: Calling groups_plugins_play to load vars for managed-node1 34589 1727204135.06216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.10324: done with get_vars() 34589 1727204135.10354: done getting variables 34589 1727204135.10414: in VariableManager get_vars() 34589 1727204135.10429: Calling all_inventory to load vars for managed-node1 34589 1727204135.10431: Calling groups_inventory to load vars for managed-node1 34589 1727204135.10433: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204135.10439: Calling all_plugins_play to load vars for managed-node1 34589 1727204135.10441: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204135.10444: Calling groups_plugins_play to load vars for managed-node1 34589 1727204135.12844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.16378: done with get_vars() 34589 1727204135.16417: done queuing things up, now waiting for results queue to drain 34589 1727204135.16420: results queue empty 34589 1727204135.16421: checking for any_errors_fatal 34589 1727204135.16422: done checking for any_errors_fatal 34589 1727204135.16423: checking for max_fail_percentage 34589 1727204135.16424: done checking for max_fail_percentage 34589 1727204135.16425: checking to see if all hosts have failed and the running result is not ok 34589 1727204135.16426: done checking to see if all hosts have failed 34589 1727204135.16426: getting the remaining hosts for this loop 34589 1727204135.16427: done getting the remaining hosts for this loop 34589 1727204135.16430: getting the next task for host managed-node1 34589 1727204135.16434: done getting next task for host managed-node1 34589 1727204135.16436: ^ task is: TASK: meta (flush_handlers) 34589 1727204135.16437: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204135.16446: getting variables 34589 1727204135.16447: in VariableManager get_vars() 34589 1727204135.16461: Calling all_inventory to load vars for managed-node1 34589 1727204135.16464: Calling groups_inventory to load vars for managed-node1 34589 1727204135.16466: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204135.16472: Calling all_plugins_play to load vars for managed-node1 34589 1727204135.16474: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204135.16683: Calling groups_plugins_play to load vars for managed-node1 34589 1727204135.18423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.20596: done with get_vars() 34589 1727204135.20738: done getting variables 34589 1727204135.20796: in VariableManager get_vars() 34589 1727204135.20811: Calling all_inventory to load vars for managed-node1 34589 1727204135.20814: Calling groups_inventory to load vars for managed-node1 34589 1727204135.20816: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204135.20822: Calling all_plugins_play to load vars for managed-node1 34589 1727204135.20824: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204135.20827: Calling groups_plugins_play to load vars for managed-node1 34589 1727204135.23466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.25231: done with get_vars() 34589 1727204135.25266: done queuing things up, now waiting for results queue to drain 34589 1727204135.25268: results queue empty 34589 1727204135.25269: checking for any_errors_fatal 34589 1727204135.25270: done checking for any_errors_fatal 34589 1727204135.25271: checking for max_fail_percentage 34589 1727204135.25272: done checking for max_fail_percentage 34589 1727204135.25273: checking to see if all hosts have failed and the running result is not ok 34589 1727204135.25274: done checking to see if all hosts have failed 34589 1727204135.25274: getting the remaining hosts for this loop 34589 1727204135.25278: done getting the remaining hosts for this loop 34589 1727204135.25282: getting the next task for host managed-node1 34589 1727204135.25285: done getting next task for host managed-node1 34589 1727204135.25286: ^ task is: None 34589 1727204135.25288: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204135.25289: done queuing things up, now waiting for results queue to drain 34589 1727204135.25290: results queue empty 34589 1727204135.25290: checking for any_errors_fatal 34589 1727204135.25291: done checking for any_errors_fatal 34589 1727204135.25292: checking for max_fail_percentage 34589 1727204135.25293: done checking for max_fail_percentage 34589 1727204135.25294: checking to see if all hosts have failed and the running result is not ok 34589 1727204135.25294: done checking to see if all hosts have failed 34589 1727204135.25296: getting the next task for host managed-node1 34589 1727204135.25298: done getting next task for host managed-node1 34589 1727204135.25299: ^ task is: None 34589 1727204135.25300: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204135.25483: in VariableManager get_vars() 34589 1727204135.25501: done with get_vars() 34589 1727204135.25507: in VariableManager get_vars() 34589 1727204135.25518: done with get_vars() 34589 1727204135.25523: variable 'omit' from source: magic vars 34589 1727204135.25553: in VariableManager get_vars() 34589 1727204135.25562: done with get_vars() 34589 1727204135.25696: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 34589 1727204135.26152: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34589 1727204135.26281: getting the remaining hosts for this loop 34589 1727204135.26283: done getting the remaining hosts for this loop 34589 1727204135.26285: getting the next task for host managed-node1 34589 1727204135.26288: done getting next task for host managed-node1 34589 1727204135.26290: ^ task is: TASK: Gathering Facts 34589 1727204135.26291: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204135.26293: getting variables 34589 1727204135.26294: in VariableManager get_vars() 34589 1727204135.26304: Calling all_inventory to load vars for managed-node1 34589 1727204135.26306: Calling groups_inventory to load vars for managed-node1 34589 1727204135.26308: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204135.26314: Calling all_plugins_play to load vars for managed-node1 34589 1727204135.26316: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204135.26318: Calling groups_plugins_play to load vars for managed-node1 34589 1727204135.29929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204135.33522: done with get_vars() 34589 1727204135.33549: done getting variables 34589 1727204135.33715: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.738) 0:00:35.472 ***** 34589 1727204135.33739: entering _queue_task() for managed-node1/gather_facts 34589 1727204135.34487: worker is 1 (out of 1 available) 34589 1727204135.34497: exiting _queue_task() for managed-node1/gather_facts 34589 1727204135.34508: done queuing things up, now waiting for results queue to drain 34589 1727204135.34510: waiting for pending results... 
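At this point the linear strategy has finished the previous play and queues the implicit fact-gathering task for the play named in the PLAY banner above; the setup module it launches is what returns the large ansible_facts payload later in the trace. The play header itself is not reproduced in this log, so the following is only a plausible sketch: the play name and the fact gathering are confirmed by the trace, while the hosts pattern and the placeholder task are assumptions.

# Hypothetical play header for the test playbook; only the name and
# gather_facts behaviour are confirmed by the log, the rest is illustrative.
- name: Delete the interface, then assert that device and profile are absent
  hosts: all
  gather_facts: true        # queues the "Gathering Facts" task traced below
  tasks:
    - name: Placeholder for the interface removal and assertions
      ansible.builtin.debug:
        msg: "actual test tasks elided"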
34589 1727204135.34958: running TaskExecutor() for managed-node1/TASK: Gathering Facts 34589 1727204135.35189: in run() - task 028d2410-947f-a9c6-cddc-0000000005ee 34589 1727204135.35216: variable 'ansible_search_path' from source: unknown 34589 1727204135.35337: calling self._execute() 34589 1727204135.35692: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204135.35696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204135.35698: variable 'omit' from source: magic vars 34589 1727204135.36401: variable 'ansible_distribution_major_version' from source: facts 34589 1727204135.36465: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204135.36479: variable 'omit' from source: magic vars 34589 1727204135.36513: variable 'omit' from source: magic vars 34589 1727204135.36600: variable 'omit' from source: magic vars 34589 1727204135.36720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204135.36816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204135.36980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204135.36984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204135.36987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204135.36990: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204135.36992: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204135.36995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204135.37194: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204135.37327: Set connection var ansible_shell_executable to /bin/sh 34589 1727204135.37331: Set connection var ansible_timeout to 10 34589 1727204135.37333: Set connection var ansible_shell_type to sh 34589 1727204135.37336: Set connection var ansible_connection to ssh 34589 1727204135.37338: Set connection var ansible_pipelining to False 34589 1727204135.37357: variable 'ansible_shell_executable' from source: unknown 34589 1727204135.37364: variable 'ansible_connection' from source: unknown 34589 1727204135.37483: variable 'ansible_module_compression' from source: unknown 34589 1727204135.37486: variable 'ansible_shell_type' from source: unknown 34589 1727204135.37488: variable 'ansible_shell_executable' from source: unknown 34589 1727204135.37490: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204135.37493: variable 'ansible_pipelining' from source: unknown 34589 1727204135.37495: variable 'ansible_timeout' from source: unknown 34589 1727204135.37497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204135.37820: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204135.38081: variable 'omit' from source: magic vars 34589 1727204135.38086: starting attempt loop 34589 1727204135.38088: running the 
handler 34589 1727204135.38090: variable 'ansible_facts' from source: unknown 34589 1727204135.38092: _low_level_execute_command(): starting 34589 1727204135.38094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204135.39367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204135.39521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204135.39577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204135.39590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204135.39694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204135.39812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204135.41612: stdout chunk (state=3): >>>/root <<< 34589 1727204135.41881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204135.41912: stderr chunk (state=3): >>><<< 34589 1727204135.41916: stdout chunk (state=3): >>><<< 34589 1727204135.42035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204135.42038: _low_level_execute_command(): starting 34589 1727204135.42041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169 `" && echo 
ansible-tmp-1727204135.4193828-37563-138774226632169="` echo /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169 `" ) && sleep 0' 34589 1727204135.43017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204135.43280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204135.43459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204135.43578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204135.45702: stdout chunk (state=3): >>>ansible-tmp-1727204135.4193828-37563-138774226632169=/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169 <<< 34589 1727204135.45857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204135.45877: stdout chunk (state=3): >>><<< 34589 1727204135.45899: stderr chunk (state=3): >>><<< 34589 1727204135.45937: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204135.4193828-37563-138774226632169=/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204135.45983: variable 'ansible_module_compression' from source: unknown 34589 1727204135.46047: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34589 1727204135.46125: variable 'ansible_facts' from source: 
unknown 34589 1727204135.46523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py 34589 1727204135.46908: Sending initial data 34589 1727204135.46911: Sent initial data (154 bytes) 34589 1727204135.47621: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204135.48001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204135.48205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204135.48314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204135.50043: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34589 1727204135.50062: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 34589 1727204135.50094: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204135.50210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204135.50289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2qxdb15d /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py <<< 34589 1727204135.50340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py" <<< 34589 1727204135.50431: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp2qxdb15d" to remote "/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py" <<< 34589 1727204135.52162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204135.52204: stderr chunk (state=3): >>><<< 34589 1727204135.52214: stdout chunk (state=3): >>><<< 34589 1727204135.52231: done transferring module to remote 34589 1727204135.52241: _low_level_execute_command(): starting 34589 1727204135.52246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/ /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py && sleep 0' 34589 1727204135.52680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204135.52687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204135.52690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204135.52693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204135.52752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204135.52755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204135.52845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204135.54802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204135.54824: stderr chunk (state=3): >>><<< 34589 1727204135.54827: stdout chunk (state=3): >>><<< 34589 1727204135.54841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204135.54848: _low_level_execute_command(): starting 34589 1727204135.54850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/AnsiballZ_setup.py && sleep 0' 34589 1727204135.55264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204135.55268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204135.55270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204135.55272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204135.55323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204135.55326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204135.55331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204135.55427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.25050: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, 
"ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "35", "epoch": "1727204135", "epoch_int": "1727204135", "date": "2024-09-24", "time": "14:55:35", "iso8601_micro": "2024-09-24T18:55:35.846574Z", "iso8601": "2024-09-24T18:55:35Z", "iso8601_basic": "20240924T145535846574", "iso8601_basic_short": "20240924T145535", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_loadavg": {"1m": 0.64697265625, "5m": 0.53662109375, "15m": 0.28857421875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version<<< 34589 1727204136.25086: stdout chunk (state=3): >>>_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 727, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785329664, "block_size": 4096, "block_total": 65519099, "block_available": 63912434, "block_used": 1606665, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": 
"973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vi<<< 34589 1727204136.25128: stdout chunk (state=3): >>>f-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34589 1727204136.27464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.27502: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 34589 1727204136.27506: stdout chunk (state=3): >>><<< 34589 1727204136.27508: stderr chunk (state=3): >>><<< 34589 1727204136.27555: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "35", "epoch": "1727204135", "epoch_int": "1727204135", "date": "2024-09-24", "time": "14:55:35", "iso8601_micro": "2024-09-24T18:55:35.846574Z", "iso8601": "2024-09-24T18:55:35Z", "iso8601_basic": "20240924T145535846574", "iso8601_basic_short": "20240924T145535", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_loadavg": {"1m": 0.64697265625, "5m": 0.53662109375, "15m": 0.28857421875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 727, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785329664, "block_size": 4096, "block_total": 65519099, "block_available": 63912434, "block_used": 1606665, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "16:ab:3d:8e:44:05", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::14ab:3dff:fe8e:4405", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "12:9d:30:6d:a8:93", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::109d:30ff:fe6d:a893", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5", "fe80::14ab:3dff:fe8e:4405", "fe80::109d:30ff:fe6d:a893"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5", "fe80::109d:30ff:fe6d:a893", "fe80::14ab:3dff:fe8e:4405"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
34589 1727204136.28191: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204136.28195: _low_level_execute_command(): starting 34589 1727204136.28197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204135.4193828-37563-138774226632169/ > /dev/null 2>&1 && sleep 0' 34589 1727204136.28774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204136.28789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204136.28829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34589 1727204136.28842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204136.28891: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204136.28946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204136.28959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.28989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.29121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.31783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.31787: stdout chunk (state=3): >>><<< 34589 1727204136.31789: stderr chunk (state=3): >>><<< 34589 1727204136.31792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204136.31794: handler run complete 34589 1727204136.31998: variable 'ansible_facts' from source: unknown 34589 1727204136.32184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.32982: variable 'ansible_facts' from source: unknown 34589 1727204136.33190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.33567: attempt loop complete, returning result 34589 1727204136.33787: _execute() done 34589 1727204136.33796: dumping result to json 34589 1727204136.33839: done dumping result, returning 34589 1727204136.34027: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-a9c6-cddc-0000000005ee] 34589 1727204136.34182: sending task result for task 028d2410-947f-a9c6-cddc-0000000005ee 34589 1727204136.34852: done sending task result for task 028d2410-947f-a9c6-cddc-0000000005ee 34589 1727204136.34857: WORKER PROCESS EXITING ok: [managed-node1] 34589 1727204136.35537: no more pending results, returning what we have 34589 1727204136.35540: results queue empty 34589 1727204136.35541: checking for any_errors_fatal 34589 1727204136.35542: done checking for any_errors_fatal 34589 1727204136.35542: checking for max_fail_percentage 34589 1727204136.35544: done checking for max_fail_percentage 34589 1727204136.35545: checking to see if all hosts have failed and the running result is not ok 34589 1727204136.35545: done checking to see if all hosts have failed 34589 1727204136.35546: getting the remaining hosts for this loop 34589 1727204136.35547: done getting the remaining hosts for this loop 34589 1727204136.35550: getting the next task for host managed-node1 34589 1727204136.35555: done getting next task for host managed-node1 34589 1727204136.35556: ^ task is: TASK: meta (flush_handlers) 34589 1727204136.35558: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204136.35562: getting variables 34589 1727204136.35564: in VariableManager get_vars() 34589 1727204136.35588: Calling all_inventory to load vars for managed-node1 34589 1727204136.35591: Calling groups_inventory to load vars for managed-node1 34589 1727204136.35595: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.35606: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.35609: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.35612: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.38060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.40671: done with get_vars() 34589 1727204136.40697: done getting variables 34589 1727204136.40759: in VariableManager get_vars() 34589 1727204136.40769: Calling all_inventory to load vars for managed-node1 34589 1727204136.40771: Calling groups_inventory to load vars for managed-node1 34589 1727204136.40779: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.40784: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.40786: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.40789: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.41985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.43682: done with get_vars() 34589 1727204136.43712: done queuing things up, now waiting for results queue to drain 34589 1727204136.43714: results queue empty 34589 1727204136.43715: checking for any_errors_fatal 34589 1727204136.43719: done checking for any_errors_fatal 34589 1727204136.43720: checking for max_fail_percentage 34589 1727204136.43725: done checking for max_fail_percentage 34589 1727204136.43726: checking to see if all hosts have failed and the running result is not ok 34589 1727204136.43727: done checking to see if all hosts have failed 34589 1727204136.43728: getting the remaining hosts for this loop 34589 1727204136.43729: done getting the remaining hosts for this loop 34589 1727204136.43731: getting the next task for host managed-node1 34589 1727204136.43735: done getting next task for host managed-node1 34589 1727204136.43738: ^ task is: TASK: Include the task 'delete_interface.yml' 34589 1727204136.43739: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204136.43741: getting variables 34589 1727204136.43742: in VariableManager get_vars() 34589 1727204136.43751: Calling all_inventory to load vars for managed-node1 34589 1727204136.43753: Calling groups_inventory to load vars for managed-node1 34589 1727204136.43755: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.43760: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.43763: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.43765: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.44949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.46532: done with get_vars() 34589 1727204136.46552: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Tuesday 24 September 2024 14:55:36 -0400 (0:00:01.128) 0:00:36.601 ***** 34589 1727204136.46633: entering _queue_task() for managed-node1/include_tasks 34589 1727204136.47206: worker is 1 (out of 1 available) 34589 1727204136.47216: exiting _queue_task() for managed-node1/include_tasks 34589 1727204136.47224: done queuing things up, now waiting for results queue to drain 34589 1727204136.47226: waiting for pending results... 34589 1727204136.47465: running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' 34589 1727204136.47471: in run() - task 028d2410-947f-a9c6-cddc-00000000009c 34589 1727204136.47477: variable 'ansible_search_path' from source: unknown 34589 1727204136.47485: calling self._execute() 34589 1727204136.47589: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204136.47602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204136.47617: variable 'omit' from source: magic vars 34589 1727204136.48007: variable 'ansible_distribution_major_version' from source: facts 34589 1727204136.48025: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204136.48036: _execute() done 34589 1727204136.48044: dumping result to json 34589 1727204136.48050: done dumping result, returning 34589 1727204136.48058: done running TaskExecutor() for managed-node1/TASK: Include the task 'delete_interface.yml' [028d2410-947f-a9c6-cddc-00000000009c] 34589 1727204136.48067: sending task result for task 028d2410-947f-a9c6-cddc-00000000009c 34589 1727204136.48310: no more pending results, returning what we have 34589 1727204136.48316: in VariableManager get_vars() 34589 1727204136.48351: Calling all_inventory to load vars for managed-node1 34589 1727204136.48353: Calling groups_inventory to load vars for managed-node1 34589 1727204136.48357: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.48371: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.48374: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.48379: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.49089: done sending task result for task 028d2410-947f-a9c6-cddc-00000000009c 34589 1727204136.49092: WORKER PROCESS EXITING 34589 1727204136.50011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.51695: done with get_vars() 34589 
1727204136.51722: variable 'ansible_search_path' from source: unknown 34589 1727204136.51739: we have included files to process 34589 1727204136.51740: generating all_blocks data 34589 1727204136.51741: done generating all_blocks data 34589 1727204136.51742: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 34589 1727204136.51743: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 34589 1727204136.51745: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 34589 1727204136.51998: done processing included file 34589 1727204136.52001: iterating over new_blocks loaded from include file 34589 1727204136.52002: in VariableManager get_vars() 34589 1727204136.52017: done with get_vars() 34589 1727204136.52019: filtering new block on tags 34589 1727204136.52042: done filtering new block on tags 34589 1727204136.52045: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 34589 1727204136.52050: extending task lists for all hosts with included blocks 34589 1727204136.52143: done extending task lists 34589 1727204136.52145: done processing included files 34589 1727204136.52145: results queue empty 34589 1727204136.52146: checking for any_errors_fatal 34589 1727204136.52148: done checking for any_errors_fatal 34589 1727204136.52149: checking for max_fail_percentage 34589 1727204136.52150: done checking for max_fail_percentage 34589 1727204136.52151: checking to see if all hosts have failed and the running result is not ok 34589 1727204136.52151: done checking to see if all hosts have failed 34589 1727204136.52152: getting the remaining hosts for this loop 34589 1727204136.52153: done getting the remaining hosts for this loop 34589 1727204136.52156: getting the next task for host managed-node1 34589 1727204136.52159: done getting next task for host managed-node1 34589 1727204136.52161: ^ task is: TASK: Remove test interface if necessary 34589 1727204136.52163: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204136.52166: getting variables 34589 1727204136.52166: in VariableManager get_vars() 34589 1727204136.52177: Calling all_inventory to load vars for managed-node1 34589 1727204136.52179: Calling groups_inventory to load vars for managed-node1 34589 1727204136.52182: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.52187: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.52189: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.52192: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.53606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.55227: done with get_vars() 34589 1727204136.55249: done getting variables 34589 1727204136.55305: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.087) 0:00:36.688 ***** 34589 1727204136.55337: entering _queue_task() for managed-node1/command 34589 1727204136.55706: worker is 1 (out of 1 available) 34589 1727204136.55718: exiting _queue_task() for managed-node1/command 34589 1727204136.55730: done queuing things up, now waiting for results queue to drain 34589 1727204136.55731: waiting for pending results... 34589 1727204136.55946: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 34589 1727204136.56064: in run() - task 028d2410-947f-a9c6-cddc-0000000005ff 34589 1727204136.56090: variable 'ansible_search_path' from source: unknown 34589 1727204136.56099: variable 'ansible_search_path' from source: unknown 34589 1727204136.56141: calling self._execute() 34589 1727204136.56240: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204136.56253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204136.56267: variable 'omit' from source: magic vars 34589 1727204136.56643: variable 'ansible_distribution_major_version' from source: facts 34589 1727204136.56660: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204136.56671: variable 'omit' from source: magic vars 34589 1727204136.56719: variable 'omit' from source: magic vars 34589 1727204136.56822: variable 'interface' from source: set_fact 34589 1727204136.56844: variable 'omit' from source: magic vars 34589 1727204136.56891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204136.56934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204136.56960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204136.56984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204136.56999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 
1727204136.57037: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204136.57046: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204136.57055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204136.57163: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204136.57177: Set connection var ansible_shell_executable to /bin/sh 34589 1727204136.57191: Set connection var ansible_timeout to 10 34589 1727204136.57199: Set connection var ansible_shell_type to sh 34589 1727204136.57214: Set connection var ansible_connection to ssh 34589 1727204136.57227: Set connection var ansible_pipelining to False 34589 1727204136.57253: variable 'ansible_shell_executable' from source: unknown 34589 1727204136.57480: variable 'ansible_connection' from source: unknown 34589 1727204136.57483: variable 'ansible_module_compression' from source: unknown 34589 1727204136.57485: variable 'ansible_shell_type' from source: unknown 34589 1727204136.57487: variable 'ansible_shell_executable' from source: unknown 34589 1727204136.57490: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204136.57493: variable 'ansible_pipelining' from source: unknown 34589 1727204136.57495: variable 'ansible_timeout' from source: unknown 34589 1727204136.57498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204136.57509: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204136.57512: variable 'omit' from source: magic vars 34589 1727204136.57514: starting attempt loop 34589 1727204136.57516: running the handler 34589 1727204136.57517: _low_level_execute_command(): starting 34589 1727204136.57520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204136.58294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204136.58322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204136.58347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.58365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.58485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.60267: stdout chunk (state=3): >>>/root 
<<< 34589 1727204136.60364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.60432: stderr chunk (state=3): >>><<< 34589 1727204136.60441: stdout chunk (state=3): >>><<< 34589 1727204136.60468: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204136.60500: _low_level_execute_command(): starting 34589 1727204136.60517: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093 `" && echo ansible-tmp-1727204136.6048474-37618-162188255792093="` echo /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093 `" ) && sleep 0' 34589 1727204136.61202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204136.61218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204136.61348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204136.61463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.61549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.61667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.63836: stdout chunk (state=3): >>>ansible-tmp-1727204136.6048474-37618-162188255792093=/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093 <<< 34589 
1727204136.64002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.64006: stdout chunk (state=3): >>><<< 34589 1727204136.64011: stderr chunk (state=3): >>><<< 34589 1727204136.64055: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204136.6048474-37618-162188255792093=/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204136.64097: variable 'ansible_module_compression' from source: unknown 34589 1727204136.64235: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204136.64238: variable 'ansible_facts' from source: unknown 34589 1727204136.64315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py 34589 1727204136.64484: Sending initial data 34589 1727204136.64617: Sent initial data (156 bytes) 34589 1727204136.65367: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204136.65431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204136.65445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.65478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.65598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.67334: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204136.67417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204136.67487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpon75_tap /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py <<< 34589 1727204136.67490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py" <<< 34589 1727204136.67553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpon75_tap" to remote "/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py" <<< 34589 1727204136.68427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.68464: stderr chunk (state=3): >>><<< 34589 1727204136.68467: stdout chunk (state=3): >>><<< 34589 1727204136.68485: done transferring module to remote 34589 1727204136.68494: _low_level_execute_command(): starting 34589 1727204136.68504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/ /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py && sleep 0' 34589 1727204136.68931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204136.68934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204136.68938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204136.68940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204136.68942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204136.68943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 
1727204136.68986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204136.69006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.69080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.71085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.71088: stdout chunk (state=3): >>><<< 34589 1727204136.71091: stderr chunk (state=3): >>><<< 34589 1727204136.71190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204136.71194: _low_level_execute_command(): starting 34589 1727204136.71197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/AnsiballZ_command.py && sleep 0' 34589 1727204136.71659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.71674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.71780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.89611: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 
14:55:36.882492", "end": "2024-09-24 14:55:36.893896", "delta": "0:00:00.011404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204136.92571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204136.92578: stdout chunk (state=3): >>><<< 34589 1727204136.92581: stderr chunk (state=3): >>><<< 34589 1727204136.92718: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:55:36.882492", "end": "2024-09-24 14:55:36.893896", "delta": "0:00:00.011404", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
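
The task that just executed here, "Remove test interface if necessary" (task path tasks/delete_interface.yml:3), runs the command module with ip link del ethtest0. Note that the raw module payload above reports "changed": true while the task result below reports changed: false, which is consistent with a changed_when override in the task file. A minimal sketch of what such a cleanup task could look like is given here, assuming the interface name is templated from the interface fact referenced earlier in the variable dump; this is an illustrative reconstruction, not the collection's verbatim delete_interface.yml, and the changed_when line is an assumption drawn from the extra conditional evaluation seen after the handler run.

# delete_interface.yml -- illustrative sketch only, not the collection's file
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  # Reported as unchanged in the play recap below; assumed changed_when override
  # based on the "Evaluated conditional (False): False" entry after handler run complete.
  changed_when: false
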
34589 1727204136.92723: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204136.92725: _low_level_execute_command(): starting 34589 1727204136.92727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204136.6048474-37618-162188255792093/ > /dev/null 2>&1 && sleep 0' 34589 1727204136.93261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204136.93284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204136.93300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204136.93318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204136.93407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204136.93435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204136.93474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204136.93554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204136.95718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204136.95726: stderr chunk (state=3): >>><<< 34589 1727204136.95728: stdout chunk (state=3): >>><<< 34589 1727204136.95742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204136.95749: handler run complete 34589 1727204136.95769: Evaluated conditional (False): False 34589 1727204136.95780: attempt loop complete, returning result 34589 1727204136.95783: _execute() done 34589 1727204136.95786: dumping result to json 34589 1727204136.95857: done dumping result, returning 34589 1727204136.95870: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [028d2410-947f-a9c6-cddc-0000000005ff] 34589 1727204136.95881: sending task result for task 028d2410-947f-a9c6-cddc-0000000005ff ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.011404", "end": "2024-09-24 14:55:36.893896", "rc": 0, "start": "2024-09-24 14:55:36.882492" } 34589 1727204136.96067: no more pending results, returning what we have 34589 1727204136.96071: results queue empty 34589 1727204136.96072: checking for any_errors_fatal 34589 1727204136.96074: done checking for any_errors_fatal 34589 1727204136.96074: checking for max_fail_percentage 34589 1727204136.96115: done checking for max_fail_percentage 34589 1727204136.96116: checking to see if all hosts have failed and the running result is not ok 34589 1727204136.96117: done checking to see if all hosts have failed 34589 1727204136.96118: getting the remaining hosts for this loop 34589 1727204136.96119: done getting the remaining hosts for this loop 34589 1727204136.96123: getting the next task for host managed-node1 34589 1727204136.96130: done getting next task for host managed-node1 34589 1727204136.96132: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 34589 1727204136.96135: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204136.96139: getting variables 34589 1727204136.96140: in VariableManager get_vars() 34589 1727204136.96170: Calling all_inventory to load vars for managed-node1 34589 1727204136.96173: Calling groups_inventory to load vars for managed-node1 34589 1727204136.96179: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204136.96191: Calling all_plugins_play to load vars for managed-node1 34589 1727204136.96194: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204136.96197: Calling groups_plugins_play to load vars for managed-node1 34589 1727204136.96720: done sending task result for task 028d2410-947f-a9c6-cddc-0000000005ff 34589 1727204136.96724: WORKER PROCESS EXITING 34589 1727204136.97712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204136.99314: done with get_vars() 34589 1727204136.99339: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.440) 0:00:37.129 ***** 34589 1727204136.99428: entering _queue_task() for managed-node1/include_tasks 34589 1727204136.99810: worker is 1 (out of 1 available) 34589 1727204136.99822: exiting _queue_task() for managed-node1/include_tasks 34589 1727204136.99835: done queuing things up, now waiting for results queue to drain 34589 1727204136.99836: waiting for pending results... 34589 1727204137.00143: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' 34589 1727204137.00255: in run() - task 028d2410-947f-a9c6-cddc-00000000009d 34589 1727204137.00280: variable 'ansible_search_path' from source: unknown 34589 1727204137.00332: calling self._execute() 34589 1727204137.00447: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.00459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.00477: variable 'omit' from source: magic vars 34589 1727204137.00878: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.00896: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.00906: _execute() done 34589 1727204137.00918: dumping result to json 34589 1727204137.00926: done dumping result, returning 34589 1727204137.00935: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_absent.yml' [028d2410-947f-a9c6-cddc-00000000009d] 34589 1727204137.00948: sending task result for task 028d2410-947f-a9c6-cddc-00000000009d 34589 1727204137.01205: no more pending results, returning what we have 34589 1727204137.01212: in VariableManager get_vars() 34589 1727204137.01245: Calling all_inventory to load vars for managed-node1 34589 1727204137.01247: Calling groups_inventory to load vars for managed-node1 34589 1727204137.01251: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.01264: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.01267: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.01269: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.01888: done sending task result for task 028d2410-947f-a9c6-cddc-00000000009d 34589 1727204137.01892: WORKER PROCESS EXITING 34589 1727204137.02834: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.04618: done with get_vars() 34589 1727204137.04637: variable 'ansible_search_path' from source: unknown 34589 1727204137.04652: we have included files to process 34589 1727204137.04653: generating all_blocks data 34589 1727204137.04655: done generating all_blocks data 34589 1727204137.04660: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 34589 1727204137.04661: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 34589 1727204137.04663: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 34589 1727204137.04813: in VariableManager get_vars() 34589 1727204137.04830: done with get_vars() 34589 1727204137.04935: done processing included file 34589 1727204137.04937: iterating over new_blocks loaded from include file 34589 1727204137.04939: in VariableManager get_vars() 34589 1727204137.04952: done with get_vars() 34589 1727204137.04954: filtering new block on tags 34589 1727204137.04971: done filtering new block on tags 34589 1727204137.04974: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 34589 1727204137.04981: extending task lists for all hosts with included blocks 34589 1727204137.05128: done extending task lists 34589 1727204137.05129: done processing included files 34589 1727204137.05130: results queue empty 34589 1727204137.05131: checking for any_errors_fatal 34589 1727204137.05136: done checking for any_errors_fatal 34589 1727204137.05137: checking for max_fail_percentage 34589 1727204137.05138: done checking for max_fail_percentage 34589 1727204137.05139: checking to see if all hosts have failed and the running result is not ok 34589 1727204137.05140: done checking to see if all hosts have failed 34589 1727204137.05141: getting the remaining hosts for this loop 34589 1727204137.05142: done getting the remaining hosts for this loop 34589 1727204137.05144: getting the next task for host managed-node1 34589 1727204137.05148: done getting next task for host managed-node1 34589 1727204137.05151: ^ task is: TASK: Include the task 'get_profile_stat.yml' 34589 1727204137.05153: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204137.05156: getting variables 34589 1727204137.05157: in VariableManager get_vars() 34589 1727204137.05167: Calling all_inventory to load vars for managed-node1 34589 1727204137.05169: Calling groups_inventory to load vars for managed-node1 34589 1727204137.05171: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.05179: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.05181: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.05184: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.06382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.08077: done with get_vars() 34589 1727204137.08098: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:55:37 -0400 (0:00:00.087) 0:00:37.216 ***** 34589 1727204137.08156: entering _queue_task() for managed-node1/include_tasks 34589 1727204137.08432: worker is 1 (out of 1 available) 34589 1727204137.08445: exiting _queue_task() for managed-node1/include_tasks 34589 1727204137.08457: done queuing things up, now waiting for results queue to drain 34589 1727204137.08458: waiting for pending results... 34589 1727204137.08640: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 34589 1727204137.08721: in run() - task 028d2410-947f-a9c6-cddc-000000000612 34589 1727204137.08732: variable 'ansible_search_path' from source: unknown 34589 1727204137.08736: variable 'ansible_search_path' from source: unknown 34589 1727204137.08764: calling self._execute() 34589 1727204137.08840: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.08844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.08854: variable 'omit' from source: magic vars 34589 1727204137.09136: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.09146: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.09152: _execute() done 34589 1727204137.09155: dumping result to json 34589 1727204137.09158: done dumping result, returning 34589 1727204137.09164: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-a9c6-cddc-000000000612] 34589 1727204137.09168: sending task result for task 028d2410-947f-a9c6-cddc-000000000612 34589 1727204137.09256: done sending task result for task 028d2410-947f-a9c6-cddc-000000000612 34589 1727204137.09259: WORKER PROCESS EXITING 34589 1727204137.09289: no more pending results, returning what we have 34589 1727204137.09294: in VariableManager get_vars() 34589 1727204137.09330: Calling all_inventory to load vars for managed-node1 34589 1727204137.09334: Calling groups_inventory to load vars for managed-node1 34589 1727204137.09338: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.09350: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.09353: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.09356: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.10663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 34589 1727204137.11609: done with get_vars() 34589 1727204137.11623: variable 'ansible_search_path' from source: unknown 34589 1727204137.11623: variable 'ansible_search_path' from source: unknown 34589 1727204137.11651: we have included files to process 34589 1727204137.11652: generating all_blocks data 34589 1727204137.11653: done generating all_blocks data 34589 1727204137.11654: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34589 1727204137.11654: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34589 1727204137.11656: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 34589 1727204137.12315: done processing included file 34589 1727204137.12316: iterating over new_blocks loaded from include file 34589 1727204137.12317: in VariableManager get_vars() 34589 1727204137.12326: done with get_vars() 34589 1727204137.12327: filtering new block on tags 34589 1727204137.12343: done filtering new block on tags 34589 1727204137.12345: in VariableManager get_vars() 34589 1727204137.12352: done with get_vars() 34589 1727204137.12353: filtering new block on tags 34589 1727204137.12364: done filtering new block on tags 34589 1727204137.12365: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 34589 1727204137.12369: extending task lists for all hosts with included blocks 34589 1727204137.12431: done extending task lists 34589 1727204137.12432: done processing included files 34589 1727204137.12432: results queue empty 34589 1727204137.12433: checking for any_errors_fatal 34589 1727204137.12435: done checking for any_errors_fatal 34589 1727204137.12435: checking for max_fail_percentage 34589 1727204137.12436: done checking for max_fail_percentage 34589 1727204137.12436: checking to see if all hosts have failed and the running result is not ok 34589 1727204137.12437: done checking to see if all hosts have failed 34589 1727204137.12437: getting the remaining hosts for this loop 34589 1727204137.12438: done getting the remaining hosts for this loop 34589 1727204137.12440: getting the next task for host managed-node1 34589 1727204137.12442: done getting next task for host managed-node1 34589 1727204137.12444: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 34589 1727204137.12446: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204137.12447: getting variables 34589 1727204137.12448: in VariableManager get_vars() 34589 1727204137.12485: Calling all_inventory to load vars for managed-node1 34589 1727204137.12487: Calling groups_inventory to load vars for managed-node1 34589 1727204137.12489: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.12493: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.12495: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.12498: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.13404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.14602: done with get_vars() 34589 1727204137.14618: done getting variables 34589 1727204137.14648: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:37 -0400 (0:00:00.065) 0:00:37.281 ***** 34589 1727204137.14669: entering _queue_task() for managed-node1/set_fact 34589 1727204137.14924: worker is 1 (out of 1 available) 34589 1727204137.14937: exiting _queue_task() for managed-node1/set_fact 34589 1727204137.14950: done queuing things up, now waiting for results queue to drain 34589 1727204137.14951: waiting for pending results... 
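
The include being processed here is get_profile_stat.yml. Its first task, "Initialize NM profile exist and ansible_managed comment flag" (path :3), is a set_fact whose result appears further down, and its second task, "Stat profile file" (path :9), loads the stat action with profile derived from the interface fact. A plausible sketch of the top of that tasks file follows, inferred only from those task names and the reported fact values; the ifcfg path and the registered variable name are assumptions for illustration, not values confirmed by this log.

# get_profile_stat.yml (opening tasks) -- illustrative sketch only
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  stat:
    # Assumed initscripts-style location; the real file may build the path differently.
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat  # assumed register name
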
34589 1727204137.15134: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 34589 1727204137.15211: in run() - task 028d2410-947f-a9c6-cddc-00000000062a 34589 1727204137.15226: variable 'ansible_search_path' from source: unknown 34589 1727204137.15230: variable 'ansible_search_path' from source: unknown 34589 1727204137.15259: calling self._execute() 34589 1727204137.15335: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.15339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.15348: variable 'omit' from source: magic vars 34589 1727204137.15638: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.15681: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.15684: variable 'omit' from source: magic vars 34589 1727204137.15727: variable 'omit' from source: magic vars 34589 1727204137.15758: variable 'omit' from source: magic vars 34589 1727204137.15826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204137.15838: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204137.15990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204137.15994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.15996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.15999: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204137.16001: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.16003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.16018: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204137.16024: Set connection var ansible_shell_executable to /bin/sh 34589 1727204137.16031: Set connection var ansible_timeout to 10 34589 1727204137.16034: Set connection var ansible_shell_type to sh 34589 1727204137.16041: Set connection var ansible_connection to ssh 34589 1727204137.16046: Set connection var ansible_pipelining to False 34589 1727204137.16067: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.16070: variable 'ansible_connection' from source: unknown 34589 1727204137.16073: variable 'ansible_module_compression' from source: unknown 34589 1727204137.16077: variable 'ansible_shell_type' from source: unknown 34589 1727204137.16079: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.16082: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.16087: variable 'ansible_pipelining' from source: unknown 34589 1727204137.16089: variable 'ansible_timeout' from source: unknown 34589 1727204137.16092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.16233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204137.16242: variable 
'omit' from source: magic vars 34589 1727204137.16248: starting attempt loop 34589 1727204137.16251: running the handler 34589 1727204137.16264: handler run complete 34589 1727204137.16273: attempt loop complete, returning result 34589 1727204137.16278: _execute() done 34589 1727204137.16280: dumping result to json 34589 1727204137.16283: done dumping result, returning 34589 1727204137.16340: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-a9c6-cddc-00000000062a] 34589 1727204137.16344: sending task result for task 028d2410-947f-a9c6-cddc-00000000062a 34589 1727204137.16402: done sending task result for task 028d2410-947f-a9c6-cddc-00000000062a 34589 1727204137.16405: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 34589 1727204137.16495: no more pending results, returning what we have 34589 1727204137.16499: results queue empty 34589 1727204137.16500: checking for any_errors_fatal 34589 1727204137.16501: done checking for any_errors_fatal 34589 1727204137.16502: checking for max_fail_percentage 34589 1727204137.16503: done checking for max_fail_percentage 34589 1727204137.16504: checking to see if all hosts have failed and the running result is not ok 34589 1727204137.16505: done checking to see if all hosts have failed 34589 1727204137.16505: getting the remaining hosts for this loop 34589 1727204137.16509: done getting the remaining hosts for this loop 34589 1727204137.16512: getting the next task for host managed-node1 34589 1727204137.16519: done getting next task for host managed-node1 34589 1727204137.16522: ^ task is: TASK: Stat profile file 34589 1727204137.16525: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204137.16529: getting variables 34589 1727204137.16530: in VariableManager get_vars() 34589 1727204137.16557: Calling all_inventory to load vars for managed-node1 34589 1727204137.16559: Calling groups_inventory to load vars for managed-node1 34589 1727204137.16562: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.16571: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.16573: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.16578: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.21581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.22435: done with get_vars() 34589 1727204137.22453: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:37 -0400 (0:00:00.078) 0:00:37.360 ***** 34589 1727204137.22514: entering _queue_task() for managed-node1/stat 34589 1727204137.22786: worker is 1 (out of 1 available) 34589 1727204137.22799: exiting _queue_task() for managed-node1/stat 34589 1727204137.22812: done queuing things up, now waiting for results queue to drain 34589 1727204137.22814: waiting for pending results... 34589 1727204137.23003: running TaskExecutor() for managed-node1/TASK: Stat profile file 34589 1727204137.23091: in run() - task 028d2410-947f-a9c6-cddc-00000000062b 34589 1727204137.23103: variable 'ansible_search_path' from source: unknown 34589 1727204137.23106: variable 'ansible_search_path' from source: unknown 34589 1727204137.23138: calling self._execute() 34589 1727204137.23221: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.23226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.23234: variable 'omit' from source: magic vars 34589 1727204137.23521: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.23531: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.23536: variable 'omit' from source: magic vars 34589 1727204137.23570: variable 'omit' from source: magic vars 34589 1727204137.23648: variable 'profile' from source: include params 34589 1727204137.23652: variable 'interface' from source: set_fact 34589 1727204137.23703: variable 'interface' from source: set_fact 34589 1727204137.23721: variable 'omit' from source: magic vars 34589 1727204137.23756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204137.23784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204137.23801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204137.23819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.23829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.23852: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204137.23855: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.23857: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.23932: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204137.23935: Set connection var ansible_shell_executable to /bin/sh 34589 1727204137.23942: Set connection var ansible_timeout to 10 34589 1727204137.23945: Set connection var ansible_shell_type to sh 34589 1727204137.23952: Set connection var ansible_connection to ssh 34589 1727204137.23957: Set connection var ansible_pipelining to False 34589 1727204137.23974: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.23978: variable 'ansible_connection' from source: unknown 34589 1727204137.23981: variable 'ansible_module_compression' from source: unknown 34589 1727204137.23983: variable 'ansible_shell_type' from source: unknown 34589 1727204137.23985: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.23988: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.23991: variable 'ansible_pipelining' from source: unknown 34589 1727204137.23994: variable 'ansible_timeout' from source: unknown 34589 1727204137.23998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.24145: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204137.24154: variable 'omit' from source: magic vars 34589 1727204137.24160: starting attempt loop 34589 1727204137.24163: running the handler 34589 1727204137.24174: _low_level_execute_command(): starting 34589 1727204137.24182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204137.24711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.24716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.24721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.24766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.24769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.24779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.24861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.26640: stdout chunk (state=3): >>>/root <<< 34589 1727204137.26734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.26766: stderr chunk 
(state=3): >>><<< 34589 1727204137.26769: stdout chunk (state=3): >>><<< 34589 1727204137.26792: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.26805: _low_level_execute_command(): starting 34589 1727204137.26815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474 `" && echo ansible-tmp-1727204137.2679248-37646-257050332372474="` echo /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474 `" ) && sleep 0' 34589 1727204137.27260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.27263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.27273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.27278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.27327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.27330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.27335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.27415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.29513: stdout chunk (state=3): >>>ansible-tmp-1727204137.2679248-37646-257050332372474=/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474 <<< 34589 1727204137.29618: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.29650: stderr chunk (state=3): >>><<< 34589 1727204137.29657: stdout chunk (state=3): >>><<< 34589 1727204137.29669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204137.2679248-37646-257050332372474=/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.29709: variable 'ansible_module_compression' from source: unknown 34589 1727204137.29753: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34589 1727204137.29790: variable 'ansible_facts' from source: unknown 34589 1727204137.29839: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py 34589 1727204137.29939: Sending initial data 34589 1727204137.29943: Sent initial data (153 bytes) 34589 1727204137.30392: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.30395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204137.30397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.30400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.30402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.30457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.30460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.30465: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 34589 1727204137.30544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.32272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 34589 1727204137.32280: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204137.32345: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204137.32421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpd_s8nrtr /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py <<< 34589 1727204137.32429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py" <<< 34589 1727204137.32498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpd_s8nrtr" to remote "/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py" <<< 34589 1727204137.32501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py" <<< 34589 1727204137.33160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.33201: stderr chunk (state=3): >>><<< 34589 1727204137.33205: stdout chunk (state=3): >>><<< 34589 1727204137.33232: done transferring module to remote 34589 1727204137.33241: _low_level_execute_command(): starting 34589 1727204137.33246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/ /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py && sleep 0' 34589 1727204137.33697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.33700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.33703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.33709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204137.33711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.33757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.33760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.33766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.33840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.35804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.35837: stderr chunk (state=3): >>><<< 34589 1727204137.35841: stdout chunk (state=3): >>><<< 34589 1727204137.35849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.35852: _low_level_execute_command(): starting 34589 1727204137.35858: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/AnsiballZ_stat.py && sleep 0' 34589 1727204137.36309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.36313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204137.36315: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.36318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.36320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 
1727204137.36371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.36381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.36384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.36462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.53271: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34589 1727204137.54910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 34589 1727204137.54914: stdout chunk (state=3): >>><<< 34589 1727204137.54917: stderr chunk (state=3): >>><<< 34589 1727204137.54932: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
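Annotation: the stat payload above just reported {"exists": false} for /etc/sysconfig/network-scripts/ifcfg-ethtest0, i.e. no initscripts ifcfg file exists for the profile under test. Based on the module arguments echoed in the next entry, the task driving this call (named "Stat profile file" further down) presumably looks something like the sketch below. This is reconstructed from the log, not copied from get_profile_stat.yml; the templated {{ profile }} variable and the profile_stat register name are inferred from later entries.

    # Reconstruction from the logged stat module_args; the variable and register
    # names are inferred from later log entries, not taken from the real file.
    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat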
34589 1727204137.54983: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204137.54988: _low_level_execute_command(): starting 34589 1727204137.54990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204137.2679248-37646-257050332372474/ > /dev/null 2>&1 && sleep 0' 34589 1727204137.55653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204137.55670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.55696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.55715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204137.55740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204137.55754: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204137.55793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.55879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.55899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.55925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.56039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.58077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.58081: stdout chunk (state=3): >>><<< 34589 1727204137.58282: stderr chunk (state=3): >>><<< 34589 1727204137.58285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.58288: handler run complete 34589 1727204137.58290: attempt loop complete, returning result 34589 1727204137.58292: _execute() done 34589 1727204137.58294: dumping result to json 34589 1727204137.58296: done dumping result, returning 34589 1727204137.58297: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-a9c6-cddc-00000000062b] 34589 1727204137.58299: sending task result for task 028d2410-947f-a9c6-cddc-00000000062b 34589 1727204137.58366: done sending task result for task 028d2410-947f-a9c6-cddc-00000000062b 34589 1727204137.58369: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 34589 1727204137.58535: no more pending results, returning what we have 34589 1727204137.58539: results queue empty 34589 1727204137.58540: checking for any_errors_fatal 34589 1727204137.58545: done checking for any_errors_fatal 34589 1727204137.58545: checking for max_fail_percentage 34589 1727204137.58547: done checking for max_fail_percentage 34589 1727204137.58548: checking to see if all hosts have failed and the running result is not ok 34589 1727204137.58549: done checking to see if all hosts have failed 34589 1727204137.58549: getting the remaining hosts for this loop 34589 1727204137.58551: done getting the remaining hosts for this loop 34589 1727204137.58554: getting the next task for host managed-node1 34589 1727204137.58560: done getting next task for host managed-node1 34589 1727204137.58562: ^ task is: TASK: Set NM profile exist flag based on the profile files 34589 1727204137.58566: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204137.58569: getting variables 34589 1727204137.58571: in VariableManager get_vars() 34589 1727204137.58605: Calling all_inventory to load vars for managed-node1 34589 1727204137.58608: Calling groups_inventory to load vars for managed-node1 34589 1727204137.58611: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.58621: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.58624: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.58627: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.60092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.61715: done with get_vars() 34589 1727204137.61750: done getting variables 34589 1727204137.61815: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:37 -0400 (0:00:00.393) 0:00:37.753 ***** 34589 1727204137.61853: entering _queue_task() for managed-node1/set_fact 34589 1727204137.62234: worker is 1 (out of 1 available) 34589 1727204137.62245: exiting _queue_task() for managed-node1/set_fact 34589 1727204137.62257: done queuing things up, now waiting for results queue to drain 34589 1727204137.62258: waiting for pending results... 
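Annotation: the set_fact task queued here ("Set NM profile exist flag based on the profile files") is skipped in the next entry because the conditional profile_stat.stat.exists evaluates to False, consistent with the stat result above. A task producing this behaviour might be shaped like the sketch below; the when: guard matches the logged conditional, but the fact being set is a hypothetical placeholder, since the skipped task body never appears in this log.

    # The when: guard matches the logged conditional; the fact name and value
    # are placeholders (the task was skipped, so its body is not in the log).
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        profile_exists: true   # hypothetical fact name
      when: profile_stat.stat.exists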
34589 1727204137.62550: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 34589 1727204137.62674: in run() - task 028d2410-947f-a9c6-cddc-00000000062c 34589 1727204137.62691: variable 'ansible_search_path' from source: unknown 34589 1727204137.62695: variable 'ansible_search_path' from source: unknown 34589 1727204137.62740: calling self._execute() 34589 1727204137.62841: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.62845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.62855: variable 'omit' from source: magic vars 34589 1727204137.63238: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.63257: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.63380: variable 'profile_stat' from source: set_fact 34589 1727204137.63393: Evaluated conditional (profile_stat.stat.exists): False 34589 1727204137.63397: when evaluation is False, skipping this task 34589 1727204137.63400: _execute() done 34589 1727204137.63403: dumping result to json 34589 1727204137.63406: done dumping result, returning 34589 1727204137.63411: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-a9c6-cddc-00000000062c] 34589 1727204137.63414: sending task result for task 028d2410-947f-a9c6-cddc-00000000062c 34589 1727204137.63641: done sending task result for task 028d2410-947f-a9c6-cddc-00000000062c 34589 1727204137.63645: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34589 1727204137.63718: no more pending results, returning what we have 34589 1727204137.63721: results queue empty 34589 1727204137.63722: checking for any_errors_fatal 34589 1727204137.63730: done checking for any_errors_fatal 34589 1727204137.63730: checking for max_fail_percentage 34589 1727204137.63732: done checking for max_fail_percentage 34589 1727204137.63733: checking to see if all hosts have failed and the running result is not ok 34589 1727204137.63734: done checking to see if all hosts have failed 34589 1727204137.63734: getting the remaining hosts for this loop 34589 1727204137.63735: done getting the remaining hosts for this loop 34589 1727204137.63739: getting the next task for host managed-node1 34589 1727204137.63745: done getting next task for host managed-node1 34589 1727204137.63748: ^ task is: TASK: Get NM profile info 34589 1727204137.63752: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204137.63756: getting variables 34589 1727204137.63757: in VariableManager get_vars() 34589 1727204137.63884: Calling all_inventory to load vars for managed-node1 34589 1727204137.63887: Calling groups_inventory to load vars for managed-node1 34589 1727204137.63890: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204137.63905: Calling all_plugins_play to load vars for managed-node1 34589 1727204137.63908: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204137.63912: Calling groups_plugins_play to load vars for managed-node1 34589 1727204137.65381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204137.66968: done with get_vars() 34589 1727204137.66992: done getting variables 34589 1727204137.67052: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:37 -0400 (0:00:00.052) 0:00:37.805 ***** 34589 1727204137.67085: entering _queue_task() for managed-node1/shell 34589 1727204137.67424: worker is 1 (out of 1 available) 34589 1727204137.67438: exiting _queue_task() for managed-node1/shell 34589 1727204137.67451: done queuing things up, now waiting for results queue to drain 34589 1727204137.67452: waiting for pending results... 34589 1727204137.67752: running TaskExecutor() for managed-node1/TASK: Get NM profile info 34589 1727204137.67872: in run() - task 028d2410-947f-a9c6-cddc-00000000062d 34589 1727204137.67891: variable 'ansible_search_path' from source: unknown 34589 1727204137.67894: variable 'ansible_search_path' from source: unknown 34589 1727204137.67937: calling self._execute() 34589 1727204137.68081: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.68085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.68089: variable 'omit' from source: magic vars 34589 1727204137.68425: variable 'ansible_distribution_major_version' from source: facts 34589 1727204137.68435: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204137.68449: variable 'omit' from source: magic vars 34589 1727204137.68495: variable 'omit' from source: magic vars 34589 1727204137.68598: variable 'profile' from source: include params 34589 1727204137.68602: variable 'interface' from source: set_fact 34589 1727204137.68880: variable 'interface' from source: set_fact 34589 1727204137.68885: variable 'omit' from source: magic vars 34589 1727204137.68888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204137.68895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204137.68898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204137.68901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.68903: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204137.68906: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204137.68911: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.68914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.68962: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204137.68967: Set connection var ansible_shell_executable to /bin/sh 34589 1727204137.68977: Set connection var ansible_timeout to 10 34589 1727204137.68980: Set connection var ansible_shell_type to sh 34589 1727204137.68991: Set connection var ansible_connection to ssh 34589 1727204137.68997: Set connection var ansible_pipelining to False 34589 1727204137.69023: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.69026: variable 'ansible_connection' from source: unknown 34589 1727204137.69029: variable 'ansible_module_compression' from source: unknown 34589 1727204137.69032: variable 'ansible_shell_type' from source: unknown 34589 1727204137.69034: variable 'ansible_shell_executable' from source: unknown 34589 1727204137.69037: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204137.69040: variable 'ansible_pipelining' from source: unknown 34589 1727204137.69042: variable 'ansible_timeout' from source: unknown 34589 1727204137.69044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204137.69181: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204137.69192: variable 'omit' from source: magic vars 34589 1727204137.69198: starting attempt loop 34589 1727204137.69200: running the handler 34589 1727204137.69215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204137.69240: _low_level_execute_command(): starting 34589 1727204137.69247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204137.70079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.70105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.70115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.70225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.72053: stdout chunk (state=3): >>>/root <<< 34589 1727204137.72341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.72345: stdout chunk (state=3): >>><<< 34589 1727204137.72348: stderr chunk (state=3): >>><<< 34589 1727204137.72373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.72390: _low_level_execute_command(): starting 34589 1727204137.72396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230 `" && echo ansible-tmp-1727204137.7237291-37669-38748522016230="` echo /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230 `" ) && sleep 0' 34589 1727204137.73088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204137.73099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.73114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.73123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204137.73135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204137.73142: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204137.73151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.73165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204137.73173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204137.73183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204137.73191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.73200: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.73212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204137.73219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204137.73228: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204137.73289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.73304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.73316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.73336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.73440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.75581: stdout chunk (state=3): >>>ansible-tmp-1727204137.7237291-37669-38748522016230=/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230 <<< 34589 1727204137.75981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.75986: stdout chunk (state=3): >>><<< 34589 1727204137.75989: stderr chunk (state=3): >>><<< 34589 1727204137.75992: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204137.7237291-37669-38748522016230=/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.75995: variable 'ansible_module_compression' from source: unknown 34589 1727204137.75997: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204137.76000: variable 'ansible_facts' from source: unknown 34589 1727204137.76024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py 34589 1727204137.76232: Sending initial data 34589 1727204137.76236: Sent initial data (155 bytes) 34589 1727204137.76972: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.77098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.77281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.79160: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 34589 1727204137.79169: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 34589 1727204137.79174: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 34589 1727204137.79184: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 34589 1727204137.79191: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 34589 1727204137.79198: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 34589 1727204137.79205: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 34589 1727204137.79217: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204137.79317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204137.79432: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpb3lme797 /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py <<< 34589 1727204137.79436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py" <<< 34589 1727204137.79528: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpb3lme797" to remote "/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py" <<< 34589 1727204137.80681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.80684: stdout chunk (state=3): >>><<< 34589 1727204137.80687: stderr chunk (state=3): >>><<< 34589 1727204137.80690: done transferring module to remote 34589 1727204137.80693: _low_level_execute_command(): starting 34589 1727204137.80696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/ /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py && sleep 0' 34589 1727204137.81292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.81331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.81343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.81365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.81479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204137.83467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204137.83511: stderr chunk (state=3): >>><<< 34589 1727204137.83515: stdout chunk (state=3): >>><<< 34589 1727204137.83535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204137.83539: _low_level_execute_command(): starting 34589 1727204137.83542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/AnsiballZ_command.py && sleep 0' 34589 1727204137.84096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204137.84181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.84185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.84187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204137.84191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204137.84193: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204137.84196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204137.84198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204137.84201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204137.84203: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204137.84205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204137.84207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204137.84209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204137.84277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204137.84308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204137.84312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204137.84406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.02833: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:55:38.009396", "end": "2024-09-24 14:55:38.026418", "delta": "0:00:00.017022", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204138.04584: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. <<< 34589 1727204138.04611: stderr chunk (state=3): >>><<< 34589 1727204138.04615: stdout chunk (state=3): >>><<< 34589 1727204138.04635: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:55:38.009396", "end": "2024-09-24 14:55:38.026418", "delta": "0:00:00.017022", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
34589 1727204138.04665: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204138.04697: _low_level_execute_command(): starting 34589 1727204138.04703: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204137.7237291-37669-38748522016230/ > /dev/null 2>&1 && sleep 0' 34589 1727204138.05298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204138.05302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204138.05304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.05384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.05388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204138.05390: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204138.05392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.05397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34589 1727204138.05401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204138.05403: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204138.05405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204138.05407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.05409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.05411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204138.05413: stderr chunk (state=3): >>>debug2: match found <<< 34589 1727204138.05415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.05495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.05498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.05525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.05651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.07591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.07618: stderr chunk (state=3): >>><<< 34589 1727204138.07621: stdout chunk (state=3): >>><<< 34589 
1727204138.07651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204138.07654: handler run complete 34589 1727204138.07669: Evaluated conditional (False): False 34589 1727204138.07679: attempt loop complete, returning result 34589 1727204138.07683: _execute() done 34589 1727204138.07686: dumping result to json 34589 1727204138.07695: done dumping result, returning 34589 1727204138.07698: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-a9c6-cddc-00000000062d] 34589 1727204138.07701: sending task result for task 028d2410-947f-a9c6-cddc-00000000062d 34589 1727204138.07837: done sending task result for task 028d2410-947f-a9c6-cddc-00000000062d 34589 1727204138.07840: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017022", "end": "2024-09-24 14:55:38.026418", "rc": 1, "start": "2024-09-24 14:55:38.009396" } MSG: non-zero return code ...ignoring 34589 1727204138.07960: no more pending results, returning what we have 34589 1727204138.07964: results queue empty 34589 1727204138.07964: checking for any_errors_fatal 34589 1727204138.07969: done checking for any_errors_fatal 34589 1727204138.07970: checking for max_fail_percentage 34589 1727204138.07971: done checking for max_fail_percentage 34589 1727204138.07972: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.07973: done checking to see if all hosts have failed 34589 1727204138.07974: getting the remaining hosts for this loop 34589 1727204138.08103: done getting the remaining hosts for this loop 34589 1727204138.08111: getting the next task for host managed-node1 34589 1727204138.08117: done getting next task for host managed-node1 34589 1727204138.08121: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34589 1727204138.08124: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204138.08128: getting variables 34589 1727204138.08130: in VariableManager get_vars() 34589 1727204138.08155: Calling all_inventory to load vars for managed-node1 34589 1727204138.08158: Calling groups_inventory to load vars for managed-node1 34589 1727204138.08161: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.08170: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.08172: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.08175: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.09760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.10956: done with get_vars() 34589 1727204138.10972: done getting variables 34589 1727204138.11017: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.439) 0:00:38.245 ***** 34589 1727204138.11044: entering _queue_task() for managed-node1/set_fact 34589 1727204138.11326: worker is 1 (out of 1 available) 34589 1727204138.11375: exiting _queue_task() for managed-node1/set_fact 34589 1727204138.11389: done queuing things up, now waiting for results queue to drain 34589 1727204138.11391: waiting for pending results... 
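The failed-but-ignored task traced above is the probe for a persistent NetworkManager profile: grep exits 1 when nothing matches, so rc=1 here simply means no ethtest0 profile file was found under /etc, and the failure is ignored so the next task can key off the return code. A minimal sketch of such a task, consistent with the command, register name, and ignore_errors behavior the log reports (templating the interface via {{ profile }} is an assumption; the exact wording in get_profile_stat.yml may differ):

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true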
34589 1727204138.11559: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 34589 1727204138.11765: in run() - task 028d2410-947f-a9c6-cddc-00000000062e 34589 1727204138.11769: variable 'ansible_search_path' from source: unknown 34589 1727204138.11772: variable 'ansible_search_path' from source: unknown 34589 1727204138.11778: calling self._execute() 34589 1727204138.11815: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.11819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.11839: variable 'omit' from source: magic vars 34589 1727204138.12285: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.12289: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.12665: variable 'nm_profile_exists' from source: set_fact 34589 1727204138.12679: Evaluated conditional (nm_profile_exists.rc == 0): False 34589 1727204138.12683: when evaluation is False, skipping this task 34589 1727204138.12685: _execute() done 34589 1727204138.12687: dumping result to json 34589 1727204138.12690: done dumping result, returning 34589 1727204138.12721: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-a9c6-cddc-00000000062e] 34589 1727204138.12724: sending task result for task 028d2410-947f-a9c6-cddc-00000000062e 34589 1727204138.12792: done sending task result for task 028d2410-947f-a9c6-cddc-00000000062e 34589 1727204138.12795: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 34589 1727204138.12872: no more pending results, returning what we have 34589 1727204138.12880: results queue empty 34589 1727204138.12881: checking for any_errors_fatal 34589 1727204138.12892: done checking for any_errors_fatal 34589 1727204138.12893: checking for max_fail_percentage 34589 1727204138.12894: done checking for max_fail_percentage 34589 1727204138.12895: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.12897: done checking to see if all hosts have failed 34589 1727204138.12897: getting the remaining hosts for this loop 34589 1727204138.12899: done getting the remaining hosts for this loop 34589 1727204138.12902: getting the next task for host managed-node1 34589 1727204138.12912: done getting next task for host managed-node1 34589 1727204138.12915: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 34589 1727204138.12919: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 34589 1727204138.12923: getting variables 34589 1727204138.12925: in VariableManager get_vars() 34589 1727204138.13068: Calling all_inventory to load vars for managed-node1 34589 1727204138.13071: Calling groups_inventory to load vars for managed-node1 34589 1727204138.13239: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.13250: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.13253: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.13256: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.15487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.17167: done with get_vars() 34589 1727204138.17195: done getting variables 34589 1727204138.17262: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.17390: variable 'profile' from source: include params 34589 1727204138.17394: variable 'interface' from source: set_fact 34589 1727204138.17457: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.064) 0:00:38.309 ***** 34589 1727204138.17496: entering _queue_task() for managed-node1/command 34589 1727204138.17857: worker is 1 (out of 1 available) 34589 1727204138.17869: exiting _queue_task() for managed-node1/command 34589 1727204138.17883: done queuing things up, now waiting for results queue to drain 34589 1727204138.17885: waiting for pending results... 
34589 1727204138.18290: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 34589 1727204138.18374: in run() - task 028d2410-947f-a9c6-cddc-000000000630 34589 1727204138.18434: variable 'ansible_search_path' from source: unknown 34589 1727204138.18589: variable 'ansible_search_path' from source: unknown 34589 1727204138.18593: calling self._execute() 34589 1727204138.18791: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.18810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.18825: variable 'omit' from source: magic vars 34589 1727204138.19777: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.19800: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.19945: variable 'profile_stat' from source: set_fact 34589 1727204138.19965: Evaluated conditional (profile_stat.stat.exists): False 34589 1727204138.19972: when evaluation is False, skipping this task 34589 1727204138.19986: _execute() done 34589 1727204138.19993: dumping result to json 34589 1727204138.20001: done dumping result, returning 34589 1727204138.20015: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-a9c6-cddc-000000000630] 34589 1727204138.20039: sending task result for task 028d2410-947f-a9c6-cddc-000000000630 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34589 1727204138.20331: no more pending results, returning what we have 34589 1727204138.20335: results queue empty 34589 1727204138.20336: checking for any_errors_fatal 34589 1727204138.20344: done checking for any_errors_fatal 34589 1727204138.20344: checking for max_fail_percentage 34589 1727204138.20346: done checking for max_fail_percentage 34589 1727204138.20347: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.20348: done checking to see if all hosts have failed 34589 1727204138.20349: getting the remaining hosts for this loop 34589 1727204138.20350: done getting the remaining hosts for this loop 34589 1727204138.20354: getting the next task for host managed-node1 34589 1727204138.20361: done getting next task for host managed-node1 34589 1727204138.20363: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 34589 1727204138.20368: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.20373: getting variables 34589 1727204138.20374: in VariableManager get_vars() 34589 1727204138.20414: Calling all_inventory to load vars for managed-node1 34589 1727204138.20418: Calling groups_inventory to load vars for managed-node1 34589 1727204138.20422: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.20436: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.20440: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.20443: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.20995: done sending task result for task 028d2410-947f-a9c6-cddc-000000000630 34589 1727204138.20999: WORKER PROCESS EXITING 34589 1727204138.22279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.23898: done with get_vars() 34589 1727204138.23924: done getting variables 34589 1727204138.23993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.24118: variable 'profile' from source: include params 34589 1727204138.24122: variable 'interface' from source: set_fact 34589 1727204138.24189: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.067) 0:00:38.377 ***** 34589 1727204138.24224: entering _queue_task() for managed-node1/set_fact 34589 1727204138.24581: worker is 1 (out of 1 available) 34589 1727204138.24709: exiting _queue_task() for managed-node1/set_fact 34589 1727204138.24720: done queuing things up, now waiting for results queue to drain 34589 1727204138.24721: waiting for pending results... 
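The command task above, and the three ifcfg checks that follow below, are all gated on profile_stat.stat.exists; with no ifcfg-ethtest0 file present on the initscripts side, each one is skipped. A minimal sketch of the gate pattern (only the task name and when condition are confirmed by the log; the grep target and register name are assumptions):

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed command
      register: lsr_net_profile_ansible_managed_comment   # assumed register name
      when: profile_stat.stat.exists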
34589 1727204138.24905: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 34589 1727204138.25051: in run() - task 028d2410-947f-a9c6-cddc-000000000631 34589 1727204138.25071: variable 'ansible_search_path' from source: unknown 34589 1727204138.25080: variable 'ansible_search_path' from source: unknown 34589 1727204138.25124: calling self._execute() 34589 1727204138.25231: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.25242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.25259: variable 'omit' from source: magic vars 34589 1727204138.25684: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.25688: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.25795: variable 'profile_stat' from source: set_fact 34589 1727204138.25819: Evaluated conditional (profile_stat.stat.exists): False 34589 1727204138.25826: when evaluation is False, skipping this task 34589 1727204138.25833: _execute() done 34589 1727204138.25839: dumping result to json 34589 1727204138.25845: done dumping result, returning 34589 1727204138.25901: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-a9c6-cddc-000000000631] 34589 1727204138.25905: sending task result for task 028d2410-947f-a9c6-cddc-000000000631 34589 1727204138.25970: done sending task result for task 028d2410-947f-a9c6-cddc-000000000631 34589 1727204138.25973: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34589 1727204138.26099: no more pending results, returning what we have 34589 1727204138.26103: results queue empty 34589 1727204138.26104: checking for any_errors_fatal 34589 1727204138.26117: done checking for any_errors_fatal 34589 1727204138.26118: checking for max_fail_percentage 34589 1727204138.26120: done checking for max_fail_percentage 34589 1727204138.26120: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.26122: done checking to see if all hosts have failed 34589 1727204138.26122: getting the remaining hosts for this loop 34589 1727204138.26124: done getting the remaining hosts for this loop 34589 1727204138.26127: getting the next task for host managed-node1 34589 1727204138.26134: done getting next task for host managed-node1 34589 1727204138.26137: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 34589 1727204138.26141: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.26145: getting variables 34589 1727204138.26147: in VariableManager get_vars() 34589 1727204138.26177: Calling all_inventory to load vars for managed-node1 34589 1727204138.26180: Calling groups_inventory to load vars for managed-node1 34589 1727204138.26184: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.26198: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.26201: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.26204: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.27774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.29452: done with get_vars() 34589 1727204138.29477: done getting variables 34589 1727204138.29545: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.29656: variable 'profile' from source: include params 34589 1727204138.29660: variable 'interface' from source: set_fact 34589 1727204138.29722: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.055) 0:00:38.432 ***** 34589 1727204138.29757: entering _queue_task() for managed-node1/command 34589 1727204138.30295: worker is 1 (out of 1 available) 34589 1727204138.30309: exiting _queue_task() for managed-node1/command 34589 1727204138.30319: done queuing things up, now waiting for results queue to drain 34589 1727204138.30320: waiting for pending results... 
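The profile_stat result consulted by these skips is registered earlier in get_profile_stat.yml, outside this excerpt. A plausible sketch of that earlier task, where the task name and path are assumptions based on the ifcfg-{{ profile }} naming used above:

    - name: Get the ifcfg file stat
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: profile_stat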
34589 1727204138.30558: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 34589 1727204138.30563: in run() - task 028d2410-947f-a9c6-cddc-000000000632 34589 1727204138.30567: variable 'ansible_search_path' from source: unknown 34589 1727204138.30569: variable 'ansible_search_path' from source: unknown 34589 1727204138.30599: calling self._execute() 34589 1727204138.30717: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.30728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.30741: variable 'omit' from source: magic vars 34589 1727204138.31121: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.31137: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.31272: variable 'profile_stat' from source: set_fact 34589 1727204138.31295: Evaluated conditional (profile_stat.stat.exists): False 34589 1727204138.31315: when evaluation is False, skipping this task 34589 1727204138.31324: _execute() done 34589 1727204138.31331: dumping result to json 34589 1727204138.31380: done dumping result, returning 34589 1727204138.31384: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-a9c6-cddc-000000000632] 34589 1727204138.31387: sending task result for task 028d2410-947f-a9c6-cddc-000000000632 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34589 1727204138.31574: no more pending results, returning what we have 34589 1727204138.31579: results queue empty 34589 1727204138.31581: checking for any_errors_fatal 34589 1727204138.31586: done checking for any_errors_fatal 34589 1727204138.31587: checking for max_fail_percentage 34589 1727204138.31589: done checking for max_fail_percentage 34589 1727204138.31590: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.31590: done checking to see if all hosts have failed 34589 1727204138.31591: getting the remaining hosts for this loop 34589 1727204138.31593: done getting the remaining hosts for this loop 34589 1727204138.31596: getting the next task for host managed-node1 34589 1727204138.31604: done getting next task for host managed-node1 34589 1727204138.31609: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 34589 1727204138.31614: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.31618: getting variables 34589 1727204138.31620: in VariableManager get_vars() 34589 1727204138.31654: Calling all_inventory to load vars for managed-node1 34589 1727204138.31657: Calling groups_inventory to load vars for managed-node1 34589 1727204138.31661: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.31675: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.31680: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.31683: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.32263: done sending task result for task 028d2410-947f-a9c6-cddc-000000000632 34589 1727204138.32267: WORKER PROCESS EXITING 34589 1727204138.33424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.35089: done with get_vars() 34589 1727204138.35121: done getting variables 34589 1727204138.35186: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.35311: variable 'profile' from source: include params 34589 1727204138.35315: variable 'interface' from source: set_fact 34589 1727204138.35379: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.056) 0:00:38.489 ***** 34589 1727204138.35414: entering _queue_task() for managed-node1/set_fact 34589 1727204138.35832: worker is 1 (out of 1 available) 34589 1727204138.35843: exiting _queue_task() for managed-node1/set_fact 34589 1727204138.35855: done queuing things up, now waiting for results queue to drain 34589 1727204138.35856: waiting for pending results... 
34589 1727204138.36100: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 34589 1727204138.36246: in run() - task 028d2410-947f-a9c6-cddc-000000000633 34589 1727204138.36286: variable 'ansible_search_path' from source: unknown 34589 1727204138.36289: variable 'ansible_search_path' from source: unknown 34589 1727204138.36308: calling self._execute() 34589 1727204138.36388: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.36398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.36413: variable 'omit' from source: magic vars 34589 1727204138.36685: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.36694: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.36774: variable 'profile_stat' from source: set_fact 34589 1727204138.36787: Evaluated conditional (profile_stat.stat.exists): False 34589 1727204138.36790: when evaluation is False, skipping this task 34589 1727204138.36793: _execute() done 34589 1727204138.36796: dumping result to json 34589 1727204138.36798: done dumping result, returning 34589 1727204138.36803: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-a9c6-cddc-000000000633] 34589 1727204138.36812: sending task result for task 028d2410-947f-a9c6-cddc-000000000633 34589 1727204138.36894: done sending task result for task 028d2410-947f-a9c6-cddc-000000000633 34589 1727204138.36897: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 34589 1727204138.36947: no more pending results, returning what we have 34589 1727204138.36950: results queue empty 34589 1727204138.36951: checking for any_errors_fatal 34589 1727204138.36958: done checking for any_errors_fatal 34589 1727204138.36958: checking for max_fail_percentage 34589 1727204138.36960: done checking for max_fail_percentage 34589 1727204138.36961: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.36962: done checking to see if all hosts have failed 34589 1727204138.36963: getting the remaining hosts for this loop 34589 1727204138.36964: done getting the remaining hosts for this loop 34589 1727204138.36968: getting the next task for host managed-node1 34589 1727204138.36978: done getting next task for host managed-node1 34589 1727204138.36980: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 34589 1727204138.36984: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.36988: getting variables 34589 1727204138.36990: in VariableManager get_vars() 34589 1727204138.37021: Calling all_inventory to load vars for managed-node1 34589 1727204138.37023: Calling groups_inventory to load vars for managed-node1 34589 1727204138.37029: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.37039: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.37041: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.37044: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.37840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.39319: done with get_vars() 34589 1727204138.39334: done getting variables 34589 1727204138.39376: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.39458: variable 'profile' from source: include params 34589 1727204138.39461: variable 'interface' from source: set_fact 34589 1727204138.39501: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.041) 0:00:38.530 ***** 34589 1727204138.39526: entering _queue_task() for managed-node1/assert 34589 1727204138.39757: worker is 1 (out of 1 available) 34589 1727204138.39772: exiting _queue_task() for managed-node1/assert 34589 1727204138.39787: done queuing things up, now waiting for results queue to drain 34589 1727204138.39788: waiting for pending results... 
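The assertion queued above ties the earlier probes together: since lsr_net_profile_exists was never set to true, the condition not lsr_net_profile_exists holds and the trace that follows records "All assertions passed". A minimal sketch of such an assert task, with the condition taken from the log and the failure message an assumption:

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        fail_msg: "Profile {{ profile }} is still present"   # assumed message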
34589 1727204138.39954: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' 34589 1727204138.40031: in run() - task 028d2410-947f-a9c6-cddc-000000000613 34589 1727204138.40042: variable 'ansible_search_path' from source: unknown 34589 1727204138.40046: variable 'ansible_search_path' from source: unknown 34589 1727204138.40074: calling self._execute() 34589 1727204138.40149: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.40154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.40162: variable 'omit' from source: magic vars 34589 1727204138.40420: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.40429: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.40435: variable 'omit' from source: magic vars 34589 1727204138.40469: variable 'omit' from source: magic vars 34589 1727204138.40536: variable 'profile' from source: include params 34589 1727204138.40540: variable 'interface' from source: set_fact 34589 1727204138.40588: variable 'interface' from source: set_fact 34589 1727204138.40602: variable 'omit' from source: magic vars 34589 1727204138.40634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204138.40659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204138.40679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204138.40692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204138.40703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204138.40727: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204138.40730: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.40733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.40806: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204138.40811: Set connection var ansible_shell_executable to /bin/sh 34589 1727204138.40825: Set connection var ansible_timeout to 10 34589 1727204138.40827: Set connection var ansible_shell_type to sh 34589 1727204138.40830: Set connection var ansible_connection to ssh 34589 1727204138.40847: Set connection var ansible_pipelining to False 34589 1727204138.40864: variable 'ansible_shell_executable' from source: unknown 34589 1727204138.40867: variable 'ansible_connection' from source: unknown 34589 1727204138.40870: variable 'ansible_module_compression' from source: unknown 34589 1727204138.40872: variable 'ansible_shell_type' from source: unknown 34589 1727204138.40874: variable 'ansible_shell_executable' from source: unknown 34589 1727204138.40878: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.40882: variable 'ansible_pipelining' from source: unknown 34589 1727204138.40924: variable 'ansible_timeout' from source: unknown 34589 1727204138.40927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.41194: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204138.41197: variable 'omit' from source: magic vars 34589 1727204138.41200: starting attempt loop 34589 1727204138.41203: running the handler 34589 1727204138.41205: variable 'lsr_net_profile_exists' from source: set_fact 34589 1727204138.41208: Evaluated conditional (not lsr_net_profile_exists): True 34589 1727204138.41210: handler run complete 34589 1727204138.41212: attempt loop complete, returning result 34589 1727204138.41214: _execute() done 34589 1727204138.41216: dumping result to json 34589 1727204138.41219: done dumping result, returning 34589 1727204138.41221: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' [028d2410-947f-a9c6-cddc-000000000613] 34589 1727204138.41223: sending task result for task 028d2410-947f-a9c6-cddc-000000000613 34589 1727204138.41303: done sending task result for task 028d2410-947f-a9c6-cddc-000000000613 34589 1727204138.41306: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34589 1727204138.41378: no more pending results, returning what we have 34589 1727204138.41382: results queue empty 34589 1727204138.41383: checking for any_errors_fatal 34589 1727204138.41388: done checking for any_errors_fatal 34589 1727204138.41389: checking for max_fail_percentage 34589 1727204138.41390: done checking for max_fail_percentage 34589 1727204138.41392: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.41392: done checking to see if all hosts have failed 34589 1727204138.41393: getting the remaining hosts for this loop 34589 1727204138.41394: done getting the remaining hosts for this loop 34589 1727204138.41397: getting the next task for host managed-node1 34589 1727204138.41403: done getting next task for host managed-node1 34589 1727204138.41406: ^ task is: TASK: Include the task 'assert_device_absent.yml' 34589 1727204138.41408: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.41411: getting variables 34589 1727204138.41413: in VariableManager get_vars() 34589 1727204138.41440: Calling all_inventory to load vars for managed-node1 34589 1727204138.41442: Calling groups_inventory to load vars for managed-node1 34589 1727204138.41446: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.41455: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.41457: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.41460: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.42732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.43623: done with get_vars() 34589 1727204138.43639: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.041) 0:00:38.571 ***** 34589 1727204138.43702: entering _queue_task() for managed-node1/include_tasks 34589 1727204138.43933: worker is 1 (out of 1 available) 34589 1727204138.43948: exiting _queue_task() for managed-node1/include_tasks 34589 1727204138.43959: done queuing things up, now waiting for results queue to drain 34589 1727204138.43961: waiting for pending results... 34589 1727204138.44371: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' 34589 1727204138.44386: in run() - task 028d2410-947f-a9c6-cddc-00000000009e 34589 1727204138.44582: variable 'ansible_search_path' from source: unknown 34589 1727204138.44764: calling self._execute() 34589 1727204138.44961: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.44966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.44982: variable 'omit' from source: magic vars 34589 1727204138.45682: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.45686: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.45689: _execute() done 34589 1727204138.45691: dumping result to json 34589 1727204138.45694: done dumping result, returning 34589 1727204138.45697: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_device_absent.yml' [028d2410-947f-a9c6-cddc-00000000009e] 34589 1727204138.45699: sending task result for task 028d2410-947f-a9c6-cddc-00000000009e 34589 1727204138.45769: done sending task result for task 028d2410-947f-a9c6-cddc-00000000009e 34589 1727204138.45772: WORKER PROCESS EXITING 34589 1727204138.45806: no more pending results, returning what we have 34589 1727204138.45812: in VariableManager get_vars() 34589 1727204138.45848: Calling all_inventory to load vars for managed-node1 34589 1727204138.45851: Calling groups_inventory to load vars for managed-node1 34589 1727204138.45856: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.45871: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.45874: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.45884: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.48031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.49778: done with get_vars() 34589 
1727204138.49800: variable 'ansible_search_path' from source: unknown 34589 1727204138.49816: we have included files to process 34589 1727204138.49817: generating all_blocks data 34589 1727204138.49819: done generating all_blocks data 34589 1727204138.49824: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 34589 1727204138.49825: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 34589 1727204138.49828: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 34589 1727204138.50010: in VariableManager get_vars() 34589 1727204138.50026: done with get_vars() 34589 1727204138.50189: done processing included file 34589 1727204138.50196: iterating over new_blocks loaded from include file 34589 1727204138.50198: in VariableManager get_vars() 34589 1727204138.50226: done with get_vars() 34589 1727204138.50228: filtering new block on tags 34589 1727204138.50245: done filtering new block on tags 34589 1727204138.50248: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 34589 1727204138.50253: extending task lists for all hosts with included blocks 34589 1727204138.50497: done extending task lists 34589 1727204138.50498: done processing included files 34589 1727204138.50499: results queue empty 34589 1727204138.50499: checking for any_errors_fatal 34589 1727204138.50502: done checking for any_errors_fatal 34589 1727204138.50502: checking for max_fail_percentage 34589 1727204138.50503: done checking for max_fail_percentage 34589 1727204138.50503: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.50504: done checking to see if all hosts have failed 34589 1727204138.50504: getting the remaining hosts for this loop 34589 1727204138.50505: done getting the remaining hosts for this loop 34589 1727204138.50507: getting the next task for host managed-node1 34589 1727204138.50510: done getting next task for host managed-node1 34589 1727204138.50511: ^ task is: TASK: Include the task 'get_interface_stat.yml' 34589 1727204138.50513: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.50514: getting variables 34589 1727204138.50515: in VariableManager get_vars() 34589 1727204138.50521: Calling all_inventory to load vars for managed-node1 34589 1727204138.50523: Calling groups_inventory to load vars for managed-node1 34589 1727204138.50524: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.50528: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.50529: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.50531: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.51207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.52060: done with get_vars() 34589 1727204138.52078: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.084) 0:00:38.656 ***** 34589 1727204138.52129: entering _queue_task() for managed-node1/include_tasks 34589 1727204138.52388: worker is 1 (out of 1 available) 34589 1727204138.52402: exiting _queue_task() for managed-node1/include_tasks 34589 1727204138.52414: done queuing things up, now waiting for results queue to drain 34589 1727204138.52416: waiting for pending results... 34589 1727204138.52598: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 34589 1727204138.52672: in run() - task 028d2410-947f-a9c6-cddc-000000000664 34589 1727204138.52685: variable 'ansible_search_path' from source: unknown 34589 1727204138.52688: variable 'ansible_search_path' from source: unknown 34589 1727204138.52720: calling self._execute() 34589 1727204138.52797: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.52801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.52813: variable 'omit' from source: magic vars 34589 1727204138.53099: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.53111: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.53118: _execute() done 34589 1727204138.53122: dumping result to json 34589 1727204138.53124: done dumping result, returning 34589 1727204138.53130: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-a9c6-cddc-000000000664] 34589 1727204138.53134: sending task result for task 028d2410-947f-a9c6-cddc-000000000664 34589 1727204138.53217: done sending task result for task 028d2410-947f-a9c6-cddc-000000000664 34589 1727204138.53220: WORKER PROCESS EXITING 34589 1727204138.53249: no more pending results, returning what we have 34589 1727204138.53254: in VariableManager get_vars() 34589 1727204138.53288: Calling all_inventory to load vars for managed-node1 34589 1727204138.53291: Calling groups_inventory to load vars for managed-node1 34589 1727204138.53295: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.53309: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.53312: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.53314: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.54211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 34589 1727204138.55084: done with get_vars() 34589 1727204138.55097: variable 'ansible_search_path' from source: unknown 34589 1727204138.55098: variable 'ansible_search_path' from source: unknown 34589 1727204138.55125: we have included files to process 34589 1727204138.55126: generating all_blocks data 34589 1727204138.55126: done generating all_blocks data 34589 1727204138.55127: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204138.55128: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204138.55129: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 34589 1727204138.55254: done processing included file 34589 1727204138.55255: iterating over new_blocks loaded from include file 34589 1727204138.55256: in VariableManager get_vars() 34589 1727204138.55265: done with get_vars() 34589 1727204138.55265: filtering new block on tags 34589 1727204138.55275: done filtering new block on tags 34589 1727204138.55278: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 34589 1727204138.55282: extending task lists for all hosts with included blocks 34589 1727204138.55340: done extending task lists 34589 1727204138.55341: done processing included files 34589 1727204138.55342: results queue empty 34589 1727204138.55342: checking for any_errors_fatal 34589 1727204138.55344: done checking for any_errors_fatal 34589 1727204138.55345: checking for max_fail_percentage 34589 1727204138.55345: done checking for max_fail_percentage 34589 1727204138.55346: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.55346: done checking to see if all hosts have failed 34589 1727204138.55347: getting the remaining hosts for this loop 34589 1727204138.55348: done getting the remaining hosts for this loop 34589 1727204138.55349: getting the next task for host managed-node1 34589 1727204138.55352: done getting next task for host managed-node1 34589 1727204138.55353: ^ task is: TASK: Get stat for interface {{ interface }} 34589 1727204138.55355: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.55357: getting variables 34589 1727204138.55357: in VariableManager get_vars() 34589 1727204138.55363: Calling all_inventory to load vars for managed-node1 34589 1727204138.55365: Calling groups_inventory to load vars for managed-node1 34589 1727204138.55366: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.55370: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.55371: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.55373: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.56021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.56919: done with get_vars() 34589 1727204138.56933: done getting variables 34589 1727204138.57047: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.049) 0:00:38.705 ***** 34589 1727204138.57069: entering _queue_task() for managed-node1/stat 34589 1727204138.57323: worker is 1 (out of 1 available) 34589 1727204138.57337: exiting _queue_task() for managed-node1/stat 34589 1727204138.57350: done queuing things up, now waiting for results queue to drain 34589 1727204138.57351: waiting for pending results... 34589 1727204138.57528: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 34589 1727204138.57607: in run() - task 028d2410-947f-a9c6-cddc-000000000687 34589 1727204138.57621: variable 'ansible_search_path' from source: unknown 34589 1727204138.57625: variable 'ansible_search_path' from source: unknown 34589 1727204138.57650: calling self._execute() 34589 1727204138.57727: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.57730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.57740: variable 'omit' from source: magic vars 34589 1727204138.58010: variable 'ansible_distribution_major_version' from source: facts 34589 1727204138.58020: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204138.58025: variable 'omit' from source: magic vars 34589 1727204138.58059: variable 'omit' from source: magic vars 34589 1727204138.58131: variable 'interface' from source: set_fact 34589 1727204138.58145: variable 'omit' from source: magic vars 34589 1727204138.58178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204138.58205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204138.58225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204138.58238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204138.58247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204138.58270: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204138.58273: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.58277: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 34589 1727204138.58350: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204138.58353: Set connection var ansible_shell_executable to /bin/sh 34589 1727204138.58361: Set connection var ansible_timeout to 10 34589 1727204138.58364: Set connection var ansible_shell_type to sh 34589 1727204138.58369: Set connection var ansible_connection to ssh 34589 1727204138.58374: Set connection var ansible_pipelining to False 34589 1727204138.58391: variable 'ansible_shell_executable' from source: unknown 34589 1727204138.58394: variable 'ansible_connection' from source: unknown 34589 1727204138.58397: variable 'ansible_module_compression' from source: unknown 34589 1727204138.58399: variable 'ansible_shell_type' from source: unknown 34589 1727204138.58401: variable 'ansible_shell_executable' from source: unknown 34589 1727204138.58403: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.58407: variable 'ansible_pipelining' from source: unknown 34589 1727204138.58413: variable 'ansible_timeout' from source: unknown 34589 1727204138.58417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.58562: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34589 1727204138.58571: variable 'omit' from source: magic vars 34589 1727204138.58579: starting attempt loop 34589 1727204138.58582: running the handler 34589 1727204138.58594: _low_level_execute_command(): starting 34589 1727204138.58601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204138.59130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.59135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 34589 1727204138.59137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.59139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.59188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.59191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.59289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.61092: stdout chunk (state=3): >>>/root <<< 34589 1727204138.61220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.61223: stdout chunk (state=3): >>><<< 34589 1727204138.61231: stderr chunk 
(state=3): >>><<< 34589 1727204138.61249: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204138.61262: _low_level_execute_command(): starting 34589 1727204138.61267: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995 `" && echo ansible-tmp-1727204138.612501-37719-77266835041995="` echo /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995 `" ) && sleep 0' 34589 1727204138.61735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.61738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.61741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.61750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.61753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.61796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.61803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.61806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.61887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.64013: stdout chunk (state=3): >>>ansible-tmp-1727204138.612501-37719-77266835041995=/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995 <<< 34589 1727204138.64124: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 34589 1727204138.64150: stderr chunk (state=3): >>><<< 34589 1727204138.64153: stdout chunk (state=3): >>><<< 34589 1727204138.64168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204138.612501-37719-77266835041995=/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204138.64213: variable 'ansible_module_compression' from source: unknown 34589 1727204138.64256: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 34589 1727204138.64289: variable 'ansible_facts' from source: unknown 34589 1727204138.64343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py 34589 1727204138.64443: Sending initial data 34589 1727204138.64446: Sent initial data (151 bytes) 34589 1727204138.64859: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.64898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.64901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204138.64903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.64905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.64910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.64954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.64957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.64961: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.65040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.66777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 34589 1727204138.66782: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204138.66851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204138.66928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmpwlen15z1 /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py <<< 34589 1727204138.66933: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py" <<< 34589 1727204138.67006: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmpwlen15z1" to remote "/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py" <<< 34589 1727204138.67009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py" <<< 34589 1727204138.68369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.68373: stderr chunk (state=3): >>><<< 34589 1727204138.68377: stdout chunk (state=3): >>><<< 34589 1727204138.68396: done transferring module to remote 34589 1727204138.68435: _low_level_execute_command(): starting 34589 1727204138.68439: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/ /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py && sleep 0' 34589 1727204138.69124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.69135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.69252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.71356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.71359: stdout chunk (state=3): >>><<< 34589 1727204138.71362: stderr chunk (state=3): >>><<< 34589 1727204138.71364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204138.71367: _low_level_execute_command(): starting 34589 1727204138.71369: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/AnsiballZ_stat.py && sleep 0' 34589 1727204138.71888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204138.71903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204138.71922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204138.71941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204138.71957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204138.71969: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204138.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.72044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.72094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.72116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.72170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.72368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.88913: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 34589 1727204138.90489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.90540: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 34589 1727204138.90592: stderr chunk (state=3): >>><<< 34589 1727204138.90717: stdout chunk (state=3): >>><<< 34589 1727204138.90720: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
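For context, the module arguments returned above correspond to a stat task roughly like the following. This is a minimal sketch reconstructed from the logged invocation (path, task name, and the get_* flags are taken from the log); the exact file it lives in and the register wiring are assumptions, since the later assert sees interface_stat arriving via set_fact rather than a plain register.

    # Sketch of the "Get stat for interface ethtest0" task, inferred from the logged module_args.
    - name: Get stat for interface '{{ interface }}'
      stat:
        path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat   # assumed name; it matches the variable read by the following assert
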
34589 1727204138.90742: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204138.90784: _low_level_execute_command(): starting 34589 1727204138.90794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204138.612501-37719-77266835041995/ > /dev/null 2>&1 && sleep 0' 34589 1727204138.91691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204138.91747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204138.91764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204138.91795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204138.92072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204138.94181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204138.94185: stdout chunk (state=3): >>><<< 34589 1727204138.94188: stderr chunk (state=3): >>><<< 34589 1727204138.94191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204138.94195: handler run complete 34589 1727204138.94198: attempt loop complete, returning result 34589 1727204138.94200: _execute() done 34589 1727204138.94202: dumping result to json 34589 1727204138.94205: done dumping result, returning 34589 1727204138.94211: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-a9c6-cddc-000000000687] 34589 1727204138.94214: sending task result for task 028d2410-947f-a9c6-cddc-000000000687 34589 1727204138.94317: done sending task result for task 028d2410-947f-a9c6-cddc-000000000687 34589 1727204138.94321: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 34589 1727204138.94481: no more pending results, returning what we have 34589 1727204138.94486: results queue empty 34589 1727204138.94487: checking for any_errors_fatal 34589 1727204138.94489: done checking for any_errors_fatal 34589 1727204138.94489: checking for max_fail_percentage 34589 1727204138.94491: done checking for max_fail_percentage 34589 1727204138.94492: checking to see if all hosts have failed and the running result is not ok 34589 1727204138.94495: done checking to see if all hosts have failed 34589 1727204138.94495: getting the remaining hosts for this loop 34589 1727204138.94497: done getting the remaining hosts for this loop 34589 1727204138.94502: getting the next task for host managed-node1 34589 1727204138.94510: done getting next task for host managed-node1 34589 1727204138.94513: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 34589 1727204138.94516: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204138.94521: getting variables 34589 1727204138.94523: in VariableManager get_vars() 34589 1727204138.94555: Calling all_inventory to load vars for managed-node1 34589 1727204138.94558: Calling groups_inventory to load vars for managed-node1 34589 1727204138.94562: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204138.94574: Calling all_plugins_play to load vars for managed-node1 34589 1727204138.94682: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204138.94688: Calling groups_plugins_play to load vars for managed-node1 34589 1727204138.96680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204138.98599: done with get_vars() 34589 1727204138.98624: done getting variables 34589 1727204138.98692: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 34589 1727204138.98822: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:55:38 -0400 (0:00:00.417) 0:00:39.123 ***** 34589 1727204138.98853: entering _queue_task() for managed-node1/assert 34589 1727204138.99200: worker is 1 (out of 1 available) 34589 1727204138.99213: exiting _queue_task() for managed-node1/assert 34589 1727204138.99336: done queuing things up, now waiting for results queue to drain 34589 1727204138.99338: waiting for pending results... 
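The task being queued here is the assertion defined at assert_device_absent.yml:5. A minimal sketch of what it likely contains, based on the conditional the log evaluates (not interface_stat.stat.exists); the failure message is an assumption and is not visible in the log.

    # Sketch of the assert task; only the conditional is confirmed by the log.
    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists
        msg: "Interface {{ interface }} is still present"   # assumed wording
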
34589 1727204138.99694: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' 34589 1727204138.99699: in run() - task 028d2410-947f-a9c6-cddc-000000000665 34589 1727204138.99702: variable 'ansible_search_path' from source: unknown 34589 1727204138.99705: variable 'ansible_search_path' from source: unknown 34589 1727204138.99710: calling self._execute() 34589 1727204138.99924: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204138.99929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204138.99940: variable 'omit' from source: magic vars 34589 1727204139.00380: variable 'ansible_distribution_major_version' from source: facts 34589 1727204139.00384: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204139.00387: variable 'omit' from source: magic vars 34589 1727204139.00390: variable 'omit' from source: magic vars 34589 1727204139.00485: variable 'interface' from source: set_fact 34589 1727204139.00780: variable 'omit' from source: magic vars 34589 1727204139.00789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204139.00797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204139.00800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204139.00802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.00804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.00806: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204139.00815: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.00818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.00820: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204139.00822: Set connection var ansible_shell_executable to /bin/sh 34589 1727204139.00824: Set connection var ansible_timeout to 10 34589 1727204139.00826: Set connection var ansible_shell_type to sh 34589 1727204139.00828: Set connection var ansible_connection to ssh 34589 1727204139.00830: Set connection var ansible_pipelining to False 34589 1727204139.00854: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.00857: variable 'ansible_connection' from source: unknown 34589 1727204139.00860: variable 'ansible_module_compression' from source: unknown 34589 1727204139.00862: variable 'ansible_shell_type' from source: unknown 34589 1727204139.00865: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.00867: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.00869: variable 'ansible_pipelining' from source: unknown 34589 1727204139.00873: variable 'ansible_timeout' from source: unknown 34589 1727204139.00877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.01027: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 34589 1727204139.01037: variable 'omit' from source: magic vars 34589 1727204139.01043: starting attempt loop 34589 1727204139.01047: running the handler 34589 1727204139.01280: variable 'interface_stat' from source: set_fact 34589 1727204139.01285: Evaluated conditional (not interface_stat.stat.exists): True 34589 1727204139.01292: handler run complete 34589 1727204139.01295: attempt loop complete, returning result 34589 1727204139.01298: _execute() done 34589 1727204139.01301: dumping result to json 34589 1727204139.01303: done dumping result, returning 34589 1727204139.01306: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' [028d2410-947f-a9c6-cddc-000000000665] 34589 1727204139.01310: sending task result for task 028d2410-947f-a9c6-cddc-000000000665 34589 1727204139.01373: done sending task result for task 028d2410-947f-a9c6-cddc-000000000665 34589 1727204139.01378: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 34589 1727204139.01435: no more pending results, returning what we have 34589 1727204139.01439: results queue empty 34589 1727204139.01440: checking for any_errors_fatal 34589 1727204139.01451: done checking for any_errors_fatal 34589 1727204139.01452: checking for max_fail_percentage 34589 1727204139.01453: done checking for max_fail_percentage 34589 1727204139.01454: checking to see if all hosts have failed and the running result is not ok 34589 1727204139.01456: done checking to see if all hosts have failed 34589 1727204139.01457: getting the remaining hosts for this loop 34589 1727204139.01458: done getting the remaining hosts for this loop 34589 1727204139.01462: getting the next task for host managed-node1 34589 1727204139.01470: done getting next task for host managed-node1 34589 1727204139.01473: ^ task is: TASK: Verify network state restored to default 34589 1727204139.01477: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204139.01481: getting variables 34589 1727204139.01484: in VariableManager get_vars() 34589 1727204139.01516: Calling all_inventory to load vars for managed-node1 34589 1727204139.01520: Calling groups_inventory to load vars for managed-node1 34589 1727204139.01524: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204139.01537: Calling all_plugins_play to load vars for managed-node1 34589 1727204139.01540: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204139.01543: Calling groups_plugins_play to load vars for managed-node1 34589 1727204139.04262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204139.06207: done with get_vars() 34589 1727204139.06234: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.074) 0:00:39.198 ***** 34589 1727204139.06335: entering _queue_task() for managed-node1/include_tasks 34589 1727204139.06693: worker is 1 (out of 1 available) 34589 1727204139.06706: exiting _queue_task() for managed-node1/include_tasks 34589 1727204139.06939: done queuing things up, now waiting for results queue to drain 34589 1727204139.06941: waiting for pending results... 34589 1727204139.07123: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 34589 1727204139.07226: in run() - task 028d2410-947f-a9c6-cddc-00000000009f 34589 1727204139.07238: variable 'ansible_search_path' from source: unknown 34589 1727204139.07281: calling self._execute() 34589 1727204139.07370: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.07383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.07391: variable 'omit' from source: magic vars 34589 1727204139.07798: variable 'ansible_distribution_major_version' from source: facts 34589 1727204139.07817: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204139.07824: _execute() done 34589 1727204139.07827: dumping result to json 34589 1727204139.07830: done dumping result, returning 34589 1727204139.07836: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [028d2410-947f-a9c6-cddc-00000000009f] 34589 1727204139.07842: sending task result for task 028d2410-947f-a9c6-cddc-00000000009f 34589 1727204139.07936: done sending task result for task 028d2410-947f-a9c6-cddc-00000000009f 34589 1727204139.07940: WORKER PROCESS EXITING 34589 1727204139.07972: no more pending results, returning what we have 34589 1727204139.07979: in VariableManager get_vars() 34589 1727204139.08017: Calling all_inventory to load vars for managed-node1 34589 1727204139.08022: Calling groups_inventory to load vars for managed-node1 34589 1727204139.08027: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204139.08043: Calling all_plugins_play to load vars for managed-node1 34589 1727204139.08046: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204139.08049: Calling groups_plugins_play to load vars for managed-node1 34589 1727204139.09720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204139.11326: done with get_vars() 34589 
1727204139.11347: variable 'ansible_search_path' from source: unknown 34589 1727204139.11362: we have included files to process 34589 1727204139.11364: generating all_blocks data 34589 1727204139.11365: done generating all_blocks data 34589 1727204139.11369: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34589 1727204139.11370: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34589 1727204139.11373: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 34589 1727204139.11791: done processing included file 34589 1727204139.11792: iterating over new_blocks loaded from include file 34589 1727204139.11794: in VariableManager get_vars() 34589 1727204139.11805: done with get_vars() 34589 1727204139.11807: filtering new block on tags 34589 1727204139.11823: done filtering new block on tags 34589 1727204139.11825: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 34589 1727204139.11830: extending task lists for all hosts with included blocks 34589 1727204139.12164: done extending task lists 34589 1727204139.12166: done processing included files 34589 1727204139.12166: results queue empty 34589 1727204139.12167: checking for any_errors_fatal 34589 1727204139.12170: done checking for any_errors_fatal 34589 1727204139.12171: checking for max_fail_percentage 34589 1727204139.12172: done checking for max_fail_percentage 34589 1727204139.12173: checking to see if all hosts have failed and the running result is not ok 34589 1727204139.12174: done checking to see if all hosts have failed 34589 1727204139.12174: getting the remaining hosts for this loop 34589 1727204139.12177: done getting the remaining hosts for this loop 34589 1727204139.12180: getting the next task for host managed-node1 34589 1727204139.12183: done getting next task for host managed-node1 34589 1727204139.12190: ^ task is: TASK: Check routes and DNS 34589 1727204139.12192: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204139.12194: getting variables 34589 1727204139.12195: in VariableManager get_vars() 34589 1727204139.12204: Calling all_inventory to load vars for managed-node1 34589 1727204139.12206: Calling groups_inventory to load vars for managed-node1 34589 1727204139.12208: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204139.12213: Calling all_plugins_play to load vars for managed-node1 34589 1727204139.12215: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204139.12218: Calling groups_plugins_play to load vars for managed-node1 34589 1727204139.13469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204139.15017: done with get_vars() 34589 1727204139.15039: done getting variables 34589 1727204139.15080: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.087) 0:00:39.286 ***** 34589 1727204139.15115: entering _queue_task() for managed-node1/shell 34589 1727204139.15467: worker is 1 (out of 1 available) 34589 1727204139.15481: exiting _queue_task() for managed-node1/shell 34589 1727204139.15495: done queuing things up, now waiting for results queue to drain 34589 1727204139.15496: waiting for pending results... 34589 1727204139.15893: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 34589 1727204139.15898: in run() - task 028d2410-947f-a9c6-cddc-00000000069f 34589 1727204139.15902: variable 'ansible_search_path' from source: unknown 34589 1727204139.15904: variable 'ansible_search_path' from source: unknown 34589 1727204139.15910: calling self._execute() 34589 1727204139.16001: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.16081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.16084: variable 'omit' from source: magic vars 34589 1727204139.16387: variable 'ansible_distribution_major_version' from source: facts 34589 1727204139.16403: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204139.16417: variable 'omit' from source: magic vars 34589 1727204139.16457: variable 'omit' from source: magic vars 34589 1727204139.16496: variable 'omit' from source: magic vars 34589 1727204139.16541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204139.16581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204139.16603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204139.16627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.16642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.16673: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
34589 1727204139.16880: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.16883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.16886: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204139.16888: Set connection var ansible_shell_executable to /bin/sh 34589 1727204139.16890: Set connection var ansible_timeout to 10 34589 1727204139.16892: Set connection var ansible_shell_type to sh 34589 1727204139.16893: Set connection var ansible_connection to ssh 34589 1727204139.16895: Set connection var ansible_pipelining to False 34589 1727204139.16897: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.16899: variable 'ansible_connection' from source: unknown 34589 1727204139.16901: variable 'ansible_module_compression' from source: unknown 34589 1727204139.16903: variable 'ansible_shell_type' from source: unknown 34589 1727204139.16905: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.16909: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.16911: variable 'ansible_pipelining' from source: unknown 34589 1727204139.16913: variable 'ansible_timeout' from source: unknown 34589 1727204139.16915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.17024: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204139.17044: variable 'omit' from source: magic vars 34589 1727204139.17053: starting attempt loop 34589 1727204139.17059: running the handler 34589 1727204139.17073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204139.17099: _low_level_execute_command(): starting 34589 1727204139.17115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204139.17894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.17921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.17933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 34589 1727204139.18046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.19857: stdout chunk (state=3): >>>/root <<< 34589 1727204139.19999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.20014: stdout chunk (state=3): >>><<< 34589 1727204139.20031: stderr chunk (state=3): >>><<< 34589 1727204139.20056: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.20077: _low_level_execute_command(): starting 34589 1727204139.20087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914 `" && echo ansible-tmp-1727204139.2006326-37744-184960689852914="` echo /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914 `" ) && sleep 0' 34589 1727204139.20742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.20758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204139.20779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204139.20853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.20922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.20954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.20973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 
1727204139.21101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.23221: stdout chunk (state=3): >>>ansible-tmp-1727204139.2006326-37744-184960689852914=/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914 <<< 34589 1727204139.23383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.23387: stdout chunk (state=3): >>><<< 34589 1727204139.23389: stderr chunk (state=3): >>><<< 34589 1727204139.23408: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204139.2006326-37744-184960689852914=/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.23581: variable 'ansible_module_compression' from source: unknown 34589 1727204139.23585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204139.23587: variable 'ansible_facts' from source: unknown 34589 1727204139.23641: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py 34589 1727204139.23797: Sending initial data 34589 1727204139.23808: Sent initial data (156 bytes) 34589 1727204139.24479: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.24483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.24546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.24580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.24614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.24711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.26481: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204139.26558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34589 1727204139.26644: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp87cwgypt /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py <<< 34589 1727204139.26648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py" <<< 34589 1727204139.26727: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp87cwgypt" to remote "/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py" <<< 34589 1727204139.27595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.27646: stderr chunk (state=3): >>><<< 34589 1727204139.27780: stdout chunk (state=3): >>><<< 34589 1727204139.27783: done transferring module to remote 34589 1727204139.27785: _low_level_execute_command(): starting 34589 1727204139.27787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/ /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py && sleep 0' 34589 1727204139.28359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.28375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204139.28394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204139.28430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204139.28445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.28494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.28559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.28588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.28715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.30781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.30784: stdout chunk (state=3): >>><<< 34589 1727204139.30786: stderr chunk (state=3): >>><<< 34589 1727204139.30789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.30791: _low_level_execute_command(): starting 34589 1727204139.30794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/AnsiballZ_command.py && sleep 0' 34589 1727204139.31445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.31461: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.31578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.49031: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2904sec preferred_lft 2904sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:39.478984", "end": "2024-09-24 14:55:39.488397", "delta": "0:00:00.009413", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204139.51084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204139.51090: stderr chunk (state=3): >>><<< 34589 1727204139.51095: stdout chunk (state=3): >>><<< 34589 1727204139.51184: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2904sec preferred_lft 2904sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:39.478984", "end": "2024-09-24 14:55:39.488397", "delta": "0:00:00.009413", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
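The _raw_params shown in the command result above come from the "Check routes and DNS" task at check_network_dns.yml:6. A minimal sketch of that task, reconstructed from the logged module arguments rather than the literal file contents:

    # Sketch of the "Check routes and DNS" shell task; the script body matches the logged _raw_params.
    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi
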
34589 1727204139.51228: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204139.51237: _low_level_execute_command(): starting 34589 1727204139.51242: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204139.2006326-37744-184960689852914/ > /dev/null 2>&1 && sleep 0' 34589 1727204139.52938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204139.52949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204139.53380: stderr chunk (state=3): >>>debug2: match not found <<< 34589 1727204139.53386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204139.53390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.53393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.53396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.53399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.55568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.55571: stdout chunk (state=3): >>><<< 34589 1727204139.55582: stderr chunk (state=3): >>><<< 34589 1727204139.55608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.55618: handler run complete 34589 1727204139.55731: Evaluated conditional (False): False 34589 1727204139.55747: attempt loop complete, returning result 34589 1727204139.55750: _execute() done 34589 1727204139.55752: dumping result to json 34589 1727204139.55760: done dumping result, returning 34589 1727204139.55768: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [028d2410-947f-a9c6-cddc-00000000069f] 34589 1727204139.55773: sending task result for task 028d2410-947f-a9c6-cddc-00000000069f 34589 1727204139.56232: done sending task result for task 028d2410-947f-a9c6-cddc-00000000069f 34589 1727204139.56235: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009413", "end": "2024-09-24 14:55:39.488397", "rc": 0, "start": "2024-09-24 14:55:39.478984" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2904sec preferred_lft 2904sec inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 34589 1727204139.56556: no more pending results, returning what we have 34589 1727204139.56561: results queue empty 34589 1727204139.56562: checking for any_errors_fatal 34589 1727204139.56563: done checking for any_errors_fatal 34589 1727204139.56564: checking for max_fail_percentage 34589 1727204139.56566: done checking for max_fail_percentage 34589 1727204139.56567: checking to see if all hosts have failed and the running result is not ok 34589 1727204139.56567: done checking to see if all hosts have failed 34589 1727204139.56568: getting the remaining hosts for this loop 34589 1727204139.56570: done getting the remaining hosts for this loop 34589 1727204139.56577: getting the next task for host managed-node1 34589 
1727204139.56584: done getting next task for host managed-node1 34589 1727204139.56587: ^ task is: TASK: Verify DNS and network connectivity 34589 1727204139.56591: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204139.56594: getting variables 34589 1727204139.56596: in VariableManager get_vars() 34589 1727204139.56752: Calling all_inventory to load vars for managed-node1 34589 1727204139.56761: Calling groups_inventory to load vars for managed-node1 34589 1727204139.56766: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204139.56781: Calling all_plugins_play to load vars for managed-node1 34589 1727204139.56785: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204139.56789: Calling groups_plugins_play to load vars for managed-node1 34589 1727204139.60186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204139.63825: done with get_vars() 34589 1727204139.63857: done getting variables 34589 1727204139.63928: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:55:39 -0400 (0:00:00.488) 0:00:39.774 ***** 34589 1727204139.63959: entering _queue_task() for managed-node1/shell 34589 1727204139.64739: worker is 1 (out of 1 available) 34589 1727204139.64752: exiting _queue_task() for managed-node1/shell 34589 1727204139.64764: done queuing things up, now waiting for results queue to drain 34589 1727204139.64765: waiting for pending results... 
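The task queued here, "Verify DNS and network connectivity" (task path check_network_dns.yml:24), runs the shell payload below; it appears verbatim in the module invocation further down in this log and is extracted here only for readability. Note that curl is called without -s, which is why the progress meter shows up in the task's stderr.

# Connectivity check run by "Verify DNS and network connectivity"
# (payload verbatim from the module invocation later in the log).
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
  if ! getent hosts "$host"; then
    echo FAILED to lookup host "$host"
    exit 1
  fi
  if ! curl -o /dev/null https://"$host"; then
    echo FAILED to contact host "$host"
    exit 1
  fi
done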
34589 1727204139.65304: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 34589 1727204139.65468: in run() - task 028d2410-947f-a9c6-cddc-0000000006a0 34589 1727204139.65491: variable 'ansible_search_path' from source: unknown 34589 1727204139.65498: variable 'ansible_search_path' from source: unknown 34589 1727204139.65661: calling self._execute() 34589 1727204139.65815: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.65912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.65928: variable 'omit' from source: magic vars 34589 1727204139.66298: variable 'ansible_distribution_major_version' from source: facts 34589 1727204139.66315: Evaluated conditional (ansible_distribution_major_version != '6'): True 34589 1727204139.66467: variable 'ansible_facts' from source: unknown 34589 1727204139.67444: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 34589 1727204139.67592: variable 'omit' from source: magic vars 34589 1727204139.67597: variable 'omit' from source: magic vars 34589 1727204139.67600: variable 'omit' from source: magic vars 34589 1727204139.67778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34589 1727204139.67943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34589 1727204139.68025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34589 1727204139.68029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.68031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34589 1727204139.68045: variable 'inventory_hostname' from source: host vars for 'managed-node1' 34589 1727204139.68053: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.68060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.68194: Set connection var ansible_module_compression to ZIP_DEFLATED 34589 1727204139.68205: Set connection var ansible_shell_executable to /bin/sh 34589 1727204139.68223: Set connection var ansible_timeout to 10 34589 1727204139.68231: Set connection var ansible_shell_type to sh 34589 1727204139.68247: Set connection var ansible_connection to ssh 34589 1727204139.68255: Set connection var ansible_pipelining to False 34589 1727204139.68287: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.68295: variable 'ansible_connection' from source: unknown 34589 1727204139.68304: variable 'ansible_module_compression' from source: unknown 34589 1727204139.68350: variable 'ansible_shell_type' from source: unknown 34589 1727204139.68353: variable 'ansible_shell_executable' from source: unknown 34589 1727204139.68355: variable 'ansible_host' from source: host vars for 'managed-node1' 34589 1727204139.68356: variable 'ansible_pipelining' from source: unknown 34589 1727204139.68358: variable 'ansible_timeout' from source: unknown 34589 1727204139.68360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 34589 1727204139.68493: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204139.68511: variable 'omit' from source: magic vars 34589 1727204139.68521: starting attempt loop 34589 1727204139.68527: running the handler 34589 1727204139.68568: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34589 1727204139.68571: _low_level_execute_command(): starting 34589 1727204139.68582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34589 1727204139.69391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.69450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 34589 1727204139.69467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.69569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.69573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.69612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.69696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.71805: stdout chunk (state=3): >>>/root <<< 34589 1727204139.71808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.71811: stdout chunk (state=3): >>><<< 34589 1727204139.71813: stderr chunk (state=3): >>><<< 34589 1727204139.71816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.71819: _low_level_execute_command(): starting 34589 1727204139.71821: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163 `" && echo ansible-tmp-1727204139.717571-37769-242987776111163="` echo /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163 `" ) && sleep 0' 34589 1727204139.73099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.73254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.73318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.73486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.75589: stdout chunk (state=3): >>>ansible-tmp-1727204139.717571-37769-242987776111163=/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163 <<< 34589 1727204139.75981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.75984: stdout chunk (state=3): >>><<< 34589 1727204139.75986: stderr chunk (state=3): >>><<< 34589 1727204139.75989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204139.717571-37769-242987776111163=/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.75991: variable 'ansible_module_compression' from source: unknown 34589 1727204139.75993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-345898w0jzzek/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 34589 1727204139.75994: variable 'ansible_facts' from source: unknown 34589 1727204139.76178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py 34589 1727204139.76360: Sending initial data 34589 1727204139.76404: Sent initial data (155 bytes) 34589 1727204139.77398: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.77466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.77526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.77544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34589 1727204139.77547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.77697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.79403: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34589 1727204139.79473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34589 1727204139.79549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-345898w0jzzek/tmp1bo6mqvs /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py <<< 34589 1727204139.79553: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py" <<< 34589 1727204139.79633: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-345898w0jzzek/tmp1bo6mqvs" to remote "/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py" <<< 34589 1727204139.80958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.80970: stderr chunk (state=3): >>><<< 34589 1727204139.80973: stdout chunk (state=3): >>><<< 34589 1727204139.81026: done transferring module to remote 34589 1727204139.81029: _low_level_execute_command(): starting 34589 1727204139.81031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/ /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py && sleep 0' 34589 1727204139.81450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204139.81483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34589 1727204139.81486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 34589 1727204139.81488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.81492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204139.81494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.81540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.81543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.81631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204139.83755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204139.83759: stdout chunk (state=3): >>><<< 34589 1727204139.83761: stderr chunk (state=3): >>><<< 34589 1727204139.83763: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204139.83766: _low_level_execute_command(): starting 34589 1727204139.83772: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/AnsiballZ_command.py && sleep 0' 34589 1727204139.84271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204139.84289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34589 1727204139.84312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34589 1727204139.84325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 34589 1727204139.84332: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.84340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 34589 1727204139.84347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 34589 1727204139.84387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34589 1727204139.84422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204139.84436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204139.84534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204140.30184: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6658 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1349 0 --:--:-- --:--:-- --:--:-- 1347", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:55:40.008796", "end": "2024-09-24 14:55:40.295454", "delta": "0:00:00.286658", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 34589 1727204140.31544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 34589 1727204140.31574: stderr chunk (state=3): >>><<< 34589 1727204140.31587: stdout chunk (state=3): >>><<< 34589 1727204140.31922: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6658 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1349 0 --:--:-- --:--:-- --:--:-- 1347", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:55:40.008796", "end": "2024-09-24 14:55:40.295454", "delta": "0:00:00.286658", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 34589 1727204140.31932: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34589 1727204140.31934: _low_level_execute_command(): starting 34589 1727204140.31937: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204139.717571-37769-242987776111163/ > /dev/null 2>&1 && sleep 0' 34589 1727204140.32866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34589 1727204140.33179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 34589 1727204140.33202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34589 1727204140.33315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34589 1727204140.35285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34589 1727204140.35567: stderr chunk (state=3): >>><<< 34589 1727204140.35571: stdout chunk (state=3): >>><<< 34589 1727204140.35783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34589 1727204140.35786: handler run complete 34589 1727204140.35788: Evaluated conditional (False): False 34589 1727204140.35790: attempt loop complete, returning result 34589 1727204140.35791: _execute() done 34589 1727204140.35793: dumping result to json 34589 1727204140.35795: done dumping result, returning 34589 1727204140.35796: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [028d2410-947f-a9c6-cddc-0000000006a0] 34589 1727204140.35798: sending task result for task 028d2410-947f-a9c6-cddc-0000000006a0 34589 1727204140.35862: done sending task result for task 028d2410-947f-a9c6-cddc-0000000006a0 34589 1727204140.35865: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.286658", "end": "2024-09-24 14:55:40.295454", "rc": 0, "start": "2024-09-24 14:55:40.008796" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6658 0 --:--:-- --:--:-- --:--:-- 6777 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1349 0 --:--:-- --:--:-- --:--:-- 1347 34589 1727204140.35955: no more pending results, returning what we have 34589 1727204140.35957: results queue empty 34589 1727204140.35958: checking for any_errors_fatal 34589 1727204140.35967: done checking for any_errors_fatal 34589 1727204140.35967: checking for max_fail_percentage 34589 1727204140.35969: done checking for max_fail_percentage 34589 1727204140.35970: checking to see if all 
hosts have failed and the running result is not ok 34589 1727204140.35971: done checking to see if all hosts have failed 34589 1727204140.35971: getting the remaining hosts for this loop 34589 1727204140.35972: done getting the remaining hosts for this loop 34589 1727204140.35978: getting the next task for host managed-node1 34589 1727204140.35986: done getting next task for host managed-node1 34589 1727204140.35992: ^ task is: TASK: meta (flush_handlers) 34589 1727204140.35994: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34589 1727204140.35998: getting variables 34589 1727204140.36000: in VariableManager get_vars() 34589 1727204140.36034: Calling all_inventory to load vars for managed-node1 34589 1727204140.36037: Calling groups_inventory to load vars for managed-node1 34589 1727204140.36041: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204140.36052: Calling all_plugins_play to load vars for managed-node1 34589 1727204140.36056: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204140.36059: Calling groups_plugins_play to load vars for managed-node1 34589 1727204140.43197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204140.45284: done with get_vars() 34589 1727204140.45310: done getting variables 34589 1727204140.45364: in VariableManager get_vars() 34589 1727204140.45381: Calling all_inventory to load vars for managed-node1 34589 1727204140.45383: Calling groups_inventory to load vars for managed-node1 34589 1727204140.45386: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204140.45391: Calling all_plugins_play to load vars for managed-node1 34589 1727204140.45393: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204140.45396: Calling groups_plugins_play to load vars for managed-node1 34589 1727204140.46688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204140.48673: done with get_vars() 34589 1727204140.48835: done queuing things up, now waiting for results queue to drain 34589 1727204140.48837: results queue empty 34589 1727204140.48838: checking for any_errors_fatal 34589 1727204140.48843: done checking for any_errors_fatal 34589 1727204140.48843: checking for max_fail_percentage 34589 1727204140.48844: done checking for max_fail_percentage 34589 1727204140.48845: checking to see if all hosts have failed and the running result is not ok 34589 1727204140.48846: done checking to see if all hosts have failed 34589 1727204140.48847: getting the remaining hosts for this loop 34589 1727204140.48848: done getting the remaining hosts for this loop 34589 1727204140.48851: getting the next task for host managed-node1 34589 1727204140.48855: done getting next task for host managed-node1 34589 1727204140.48856: ^ task is: TASK: meta (flush_handlers) 34589 1727204140.48858: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204140.48861: getting variables 34589 1727204140.48862: in VariableManager get_vars() 34589 1727204140.48871: Calling all_inventory to load vars for managed-node1 34589 1727204140.48873: Calling groups_inventory to load vars for managed-node1 34589 1727204140.48882: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204140.48888: Calling all_plugins_play to load vars for managed-node1 34589 1727204140.48891: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204140.48894: Calling groups_plugins_play to load vars for managed-node1 34589 1727204140.50063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204140.51727: done with get_vars() 34589 1727204140.51749: done getting variables 34589 1727204140.51803: in VariableManager get_vars() 34589 1727204140.51813: Calling all_inventory to load vars for managed-node1 34589 1727204140.51816: Calling groups_inventory to load vars for managed-node1 34589 1727204140.51823: Calling all_plugins_inventory to load vars for managed-node1 34589 1727204140.51829: Calling all_plugins_play to load vars for managed-node1 34589 1727204140.51831: Calling groups_plugins_inventory to load vars for managed-node1 34589 1727204140.51834: Calling groups_plugins_play to load vars for managed-node1 34589 1727204140.53329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34589 1727204140.55015: done with get_vars() 34589 1727204140.55039: done queuing things up, now waiting for results queue to drain 34589 1727204140.55042: results queue empty 34589 1727204140.55042: checking for any_errors_fatal 34589 1727204140.55044: done checking for any_errors_fatal 34589 1727204140.55044: checking for max_fail_percentage 34589 1727204140.55045: done checking for max_fail_percentage 34589 1727204140.55046: checking to see if all hosts have failed and the running result is not ok 34589 1727204140.55047: done checking to see if all hosts have failed 34589 1727204140.55047: getting the remaining hosts for this loop 34589 1727204140.55048: done getting the remaining hosts for this loop 34589 1727204140.55051: getting the next task for host managed-node1 34589 1727204140.55055: done getting next task for host managed-node1 34589 1727204140.55055: ^ task is: None 34589 1727204140.55057: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34589 1727204140.55058: done queuing things up, now waiting for results queue to drain 34589 1727204140.55059: results queue empty 34589 1727204140.55060: checking for any_errors_fatal 34589 1727204140.55061: done checking for any_errors_fatal 34589 1727204140.55061: checking for max_fail_percentage 34589 1727204140.55062: done checking for max_fail_percentage 34589 1727204140.55063: checking to see if all hosts have failed and the running result is not ok 34589 1727204140.55064: done checking to see if all hosts have failed 34589 1727204140.55138: getting the next task for host managed-node1 34589 1727204140.55142: done getting next task for host managed-node1 34589 1727204140.55143: ^ task is: None 34589 1727204140.55145: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1              : ok=75   changed=2    unreachable=0    failed=0    skipped=75   rescued=0    ignored=1

Tuesday 24 September 2024  14:55:40 -0400 (0:00:00.912)       0:00:40.686 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.20s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.16s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.05s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.70s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.60s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.29s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface ethtest0 ------------------------------------------ 1.20s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.92s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Verify DNS and network connectivity ------------------------------------- 0.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.84s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather the minimum subset of ansible_facts required by the network role test --- 0.81s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Check if system is ostree ----------------------------------------------- 0.75s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.74s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
34589 1727204140.55242: RUNNING CLEANUP
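A note on reproducing this trace: the per-record PID/timestamp prefixes and the '_ansible_verbosity': 2 seen in the module arguments are consistent with Ansible's internal debug output being enabled alongside -vv. A hedged sketch of such an invocation follows; the inventory path is a placeholder, and the entry playbook is assumed from the recap above.

# Assumed invocation (sketch only): produces PID/timestamp-prefixed debug
# records plus verbosity-2 task output similar to the log above.
ANSIBLE_DEBUG=1 ansible-playbook -vv \
  -i <inventory.yml> \
  /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml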