19285 1727203900.13635: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-bGV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
19285 1727203900.14688: Added group all to inventory
19285 1727203900.14690: Added group ungrouped to inventory
19285 1727203900.14695: Group all now contains ungrouped
19285 1727203900.14698: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml
19285 1727203900.42197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
19285 1727203900.42259: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
19285 1727203900.42491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
19285 1727203900.42554: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
19285 1727203900.42745: Loaded config def from plugin (inventory/script)
19285 1727203900.42747: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
19285 1727203900.42893: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
19285 1727203900.42992: Loaded config def from plugin (inventory/yaml)
19285 1727203900.42995: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
19285 1727203900.43266: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
19285
1727203900.44188: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
19285 1727203900.44192: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
19285 1727203900.44195: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
19285 1727203900.44201: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
19285 1727203900.44206: Loading data from /tmp/network-zt6/inventory-rSl.yml
19285 1727203900.44390: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto
19285 1727203900.44533: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
19285 1727203900.44580: Loading data from /tmp/network-zt6/inventory-rSl.yml
19285 1727203900.44755: group all already in inventory
19285 1727203900.44763: set inventory_file for managed-node1
19285 1727203900.44768: set inventory_dir for managed-node1
19285 1727203900.44769: Added host managed-node1 to inventory
19285 1727203900.44771: Added host managed-node1 to group all
19285 1727203900.44772: set ansible_host for managed-node1
19285 1727203900.44773: set ansible_ssh_extra_args for managed-node1
19285 1727203900.44781: set inventory_file for managed-node2
19285 1727203900.44785: set inventory_dir for managed-node2
19285 1727203900.44785: Added host managed-node2 to inventory
19285 1727203900.44787: Added host managed-node2 to group all
19285 1727203900.44788: set ansible_host for managed-node2
19285 1727203900.44789: set ansible_ssh_extra_args for managed-node2
19285 1727203900.44792: set inventory_file for managed-node3
19285 1727203900.44794: set inventory_dir for managed-node3
19285 1727203900.44795: Added host managed-node3 to inventory
19285 1727203900.44796: Added host managed-node3 to group all
19285 1727203900.44797: set ansible_host for
managed-node3
19285 1727203900.44798: set ansible_ssh_extra_args for managed-node3
19285 1727203900.44800: Reconcile groups and hosts in inventory.
19285 1727203900.44804: Group ungrouped now contains managed-node1
19285 1727203900.44806: Group ungrouped now contains managed-node2
19285 1727203900.44808: Group ungrouped now contains managed-node3
19285 1727203900.44899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
19285 1727203900.45030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
19285 1727203900.45082: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
19285 1727203900.45115: Loaded config def from plugin (vars/host_group_vars)
19285 1727203900.45118: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
19285 1727203900.45125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
19285 1727203900.45132: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
19285 1727203900.45174: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
19285 1727203900.45568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203900.45690: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
19285 1727203900.45729: Loaded config def from plugin (connection/local)
19285 1727203900.45734: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False,
class_only=True)
19285 1727203900.46460: Loaded config def from plugin (connection/paramiko_ssh)
19285 1727203900.46463: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
19285 1727203900.47413: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19285 1727203900.47453: Loaded config def from plugin (connection/psrp)
19285 1727203900.47455: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
19285 1727203900.48203: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19285 1727203900.48242: Loaded config def from plugin (connection/ssh)
19285 1727203900.48245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
19285 1727203900.50230: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19285 1727203900.50270: Loaded config def from plugin (connection/winrm)
19285 1727203900.50274: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
19285 1727203900.50309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
19285 1727203900.50383: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
19285 1727203900.50458: Loaded config def from plugin
(shell/cmd)
19285 1727203900.50460: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
19285 1727203900.50490: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
19285 1727203900.50556: Loaded config def from plugin (shell/powershell)
19285 1727203900.50558: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
19285 1727203900.50608: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
19285 1727203900.50794: Loaded config def from plugin (shell/sh)
19285 1727203900.50796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
19285 1727203900.50826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
19285 1727203900.50949: Loaded config def from plugin (become/runas)
19285 1727203900.50951: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
19285 1727203900.51456: Loaded config def from plugin (become/su)
19285 1727203900.51458: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
19285 1727203900.51835: Loaded config def from plugin (become/sudo)
19285 1727203900.51837: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
19285 1727203900.51869: Loading data from
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
19285 1727203900.52595: in VariableManager get_vars()
19285 1727203900.52617: done with get_vars()
19285 1727203900.52856: trying /usr/local/lib/python3.12/site-packages/ansible/modules
19285 1727203900.59721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
19285 1727203900.60253: in VariableManager get_vars()
19285 1727203900.60258: done with get_vars()
19285 1727203900.60261: variable 'playbook_dir' from source: magic vars
19285 1727203900.60262: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.60263: variable 'ansible_config_file' from source: magic vars
19285 1727203900.60263: variable 'groups' from source: magic vars
19285 1727203900.60264: variable 'omit' from source: magic vars
19285 1727203900.60265: variable 'ansible_version' from source: magic vars
19285 1727203900.60266: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.60266: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.60267: variable 'ansible_forks' from source: magic vars
19285 1727203900.60268: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.60268: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.60269: variable 'ansible_limit' from source: magic vars
19285 1727203900.60270: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.60270: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.60308: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
19285 1727203900.61531: in VariableManager get_vars()
19285 1727203900.61548: done with get_vars()
19285 1727203900.61987: in VariableManager get_vars()
19285 1727203900.62001: done with get_vars()
19285 1727203900.62033: in VariableManager get_vars()
19285 1727203900.62045: done with get_vars()
19285
1727203900.62119: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19285 1727203900.62738: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19285 1727203900.63270: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19285 1727203900.65128: in VariableManager get_vars()
19285 1727203900.65149: done with get_vars()
19285 1727203900.66229: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
19285 1727203900.66365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19285 1727203900.69716: in VariableManager get_vars()
19285 1727203900.69719: done with get_vars()
19285 1727203900.69722: variable 'playbook_dir' from source: magic vars
19285 1727203900.69723: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.69724: variable 'ansible_config_file' from source: magic vars
19285 1727203900.69724: variable 'groups' from source: magic vars
19285 1727203900.69725: variable 'omit' from source: magic vars
19285 1727203900.69726: variable 'ansible_version' from source: magic vars
19285 1727203900.69727: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.69727: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.69728: variable 'ansible_forks' from source: magic vars
19285 1727203900.69729: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.69729: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.69730: variable 'ansible_limit' from source: magic vars
19285 1727203900.69731: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.69732: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.69765: Loading data from
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
19285 1727203900.70066: in VariableManager get_vars()
19285 1727203900.70083: done with get_vars()
19285 1727203900.70117: in VariableManager get_vars()
19285 1727203900.70121: done with get_vars()
19285 1727203900.70123: variable 'playbook_dir' from source: magic vars
19285 1727203900.70124: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.70125: variable 'ansible_config_file' from source: magic vars
19285 1727203900.70126: variable 'groups' from source: magic vars
19285 1727203900.70127: variable 'omit' from source: magic vars
19285 1727203900.70127: variable 'ansible_version' from source: magic vars
19285 1727203900.70128: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.70129: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.70129: variable 'ansible_forks' from source: magic vars
19285 1727203900.70130: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.70131: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.70132: variable 'ansible_limit' from source: magic vars
19285 1727203900.70132: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.70133: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.70164: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
19285 1727203900.70431: in VariableManager get_vars()
19285 1727203900.70444: done with get_vars()
19285 1727203900.70495: in VariableManager get_vars()
19285 1727203900.70498: done with get_vars()
19285 1727203900.70500: variable 'playbook_dir' from source: magic vars
19285 1727203900.70501: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.70502: variable 'ansible_config_file' from source: magic vars
19285 1727203900.70503: variable 'groups' from source: magic
vars
19285 1727203900.70503: variable 'omit' from source: magic vars
19285 1727203900.70504: variable 'ansible_version' from source: magic vars
19285 1727203900.70505: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.70506: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.70506: variable 'ansible_forks' from source: magic vars
19285 1727203900.70512: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.70513: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.70514: variable 'ansible_limit' from source: magic vars
19285 1727203900.70514: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.70515: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.70545: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
19285 1727203900.70815: in VariableManager get_vars()
19285 1727203900.70819: done with get_vars()
19285 1727203900.70821: variable 'playbook_dir' from source: magic vars
19285 1727203900.70822: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.70822: variable 'ansible_config_file' from source: magic vars
19285 1727203900.70823: variable 'groups' from source: magic vars
19285 1727203900.70824: variable 'omit' from source: magic vars
19285 1727203900.70824: variable 'ansible_version' from source: magic vars
19285 1727203900.70825: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.70826: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.70827: variable 'ansible_forks' from source: magic vars
19285 1727203900.70827: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.70828: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.70829: variable 'ansible_limit' from source: magic vars
19285 1727203900.70829: variable 'ansible_run_tags' from source: magic
vars
19285 1727203900.70830: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.70858: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
19285 1727203900.70925: in VariableManager get_vars()
19285 1727203900.70936: done with get_vars()
19285 1727203900.71181: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19285 1727203900.71291: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19285 1727203900.71577: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19285 1727203900.72160: in VariableManager get_vars()
19285 1727203900.72383: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19285 1727203900.75722: in VariableManager get_vars()
19285 1727203900.75737: done with get_vars()
19285 1727203900.75773: in VariableManager get_vars()
19285 1727203900.76179: done with get_vars()
19285 1727203900.76183: variable 'playbook_dir' from source: magic vars
19285 1727203900.76183: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.76184: variable 'ansible_config_file' from source: magic vars
19285 1727203900.76185: variable 'groups' from source: magic vars
19285 1727203900.76186: variable 'omit' from source: magic vars
19285 1727203900.76187: variable 'ansible_version' from source: magic vars
19285 1727203900.76187: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.76188: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.76189: variable 'ansible_forks' from source: magic vars
19285 1727203900.76190: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.76190: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.76191: variable 'ansible_limit' from
source: magic vars
19285 1727203900.76192: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.76193: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.76227: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
19285 1727203900.76304: in VariableManager get_vars()
19285 1727203900.76316: done with get_vars()
19285 1727203900.76357: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19285 1727203900.76863: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19285 1727203900.77116: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19285 1727203900.83262: in VariableManager get_vars()
19285 1727203900.83393: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19285 1727203900.87240: in VariableManager get_vars()
19285 1727203900.87244: done with get_vars()
19285 1727203900.87247: variable 'playbook_dir' from source: magic vars
19285 1727203900.87248: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.87248: variable 'ansible_config_file' from source: magic vars
19285 1727203900.87249: variable 'groups' from source: magic vars
19285 1727203900.87250: variable 'omit' from source: magic vars
19285 1727203900.87251: variable 'ansible_version' from source: magic vars
19285 1727203900.87251: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.87252: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.87253: variable 'ansible_forks' from source: magic vars
19285 1727203900.87254: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.87254: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.87255: variable 'ansible_limit' from source:
magic vars
19285 1727203900.87256: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.87257: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.87696: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
19285 1727203900.87766: in VariableManager get_vars()
19285 1727203900.87795: done with get_vars()
19285 1727203900.87830: in VariableManager get_vars()
19285 1727203900.87833: done with get_vars()
19285 1727203900.87835: variable 'playbook_dir' from source: magic vars
19285 1727203900.87836: variable 'ansible_playbook_python' from source: magic vars
19285 1727203900.87837: variable 'ansible_config_file' from source: magic vars
19285 1727203900.87837: variable 'groups' from source: magic vars
19285 1727203900.87838: variable 'omit' from source: magic vars
19285 1727203900.87839: variable 'ansible_version' from source: magic vars
19285 1727203900.87840: variable 'ansible_check_mode' from source: magic vars
19285 1727203900.87840: variable 'ansible_diff_mode' from source: magic vars
19285 1727203900.87841: variable 'ansible_forks' from source: magic vars
19285 1727203900.87842: variable 'ansible_inventory_sources' from source: magic vars
19285 1727203900.87843: variable 'ansible_skip_tags' from source: magic vars
19285 1727203900.87843: variable 'ansible_limit' from source: magic vars
19285 1727203900.87844: variable 'ansible_run_tags' from source: magic vars
19285 1727203900.87845: variable 'ansible_verbosity' from source: magic vars
19285 1727203900.88236: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
19285 1727203900.88308: in VariableManager get_vars()
19285 1727203900.88323: done with get_vars()
19285 1727203900.88390: in VariableManager get_vars()
19285 1727203900.88404: done with get_vars()
19285 1727203900.88866: trying
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback
19285 1727203900.88882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
19285 1727203900.89345: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
19285 1727203900.89648: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
19285 1727203900.89655: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
19285 1727203900.89691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
19285 1727203900.89717: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
19285 1727203900.90281: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
19285 1727203900.90340: Loaded config def from plugin (callback/default)
19285 1727203900.90343: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
19285 1727203900.92282: Loaded config def from plugin (callback/junit)
19285 1727203900.92285: Loading CallbackModule
'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
19285 1727203900.92332: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
19285 1727203900.92403: Loaded config def from plugin (callback/minimal)
19285 1727203900.92405: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
19285 1727203900.92442: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
19285 1727203900.92509: Loaded config def from plugin (callback/tree)
19285 1727203900.92512: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
19285 1727203900.92631: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
19285 1727203900.92634: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from
/tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
19285 1727203900.92663: in VariableManager get_vars()
19285 1727203900.92686: done with get_vars()
19285 1727203900.92694: in VariableManager get_vars()
19285 1727203900.92704: done with get_vars()
19285 1727203900.92708: variable 'omit' from source: magic vars
19285 1727203900.92745: in VariableManager get_vars()
19285 1727203900.92759: done with get_vars()
19285 1727203900.92783: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
19285 1727203900.93371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
19285 1727203900.93453: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
19285 1727203900.93488: getting the remaining hosts for this loop
19285 1727203900.93490: done getting the remaining hosts for this loop
19285 1727203900.93492: getting the next task for host managed-node2
19285 1727203900.93496: done getting next task for host managed-node2
19285 1727203900.93497: ^ task is: TASK: Gathering Facts
19285 1727203900.93499: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False
19285 1727203900.93501: getting variables
19285 1727203900.93502: in VariableManager get_vars()
19285 1727203900.93513: Calling all_inventory to load vars for managed-node2
19285 1727203900.93515: Calling groups_inventory to load vars for managed-node2
19285 1727203900.93517: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203900.93528: Calling all_plugins_play to load vars for managed-node2
19285 1727203900.93539: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203900.93542: Calling groups_plugins_play to load vars for managed-node2
19285 1727203900.93590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203900.93641: done with get_vars()
19285 1727203900.93647: done getting variables
19285 1727203900.93715: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Tuesday 24 September 2024 14:51:40 -0400 (0:00:00.011) 0:00:00.011 *****
19285 1727203900.93736: entering _queue_task() for managed-node2/gather_facts
19285 1727203900.93737: Creating lock for gather_facts
19285 1727203900.94235: worker is 1 (out of 1 available)
19285 1727203900.94245: exiting _queue_task() for managed-node2/gather_facts
19285 1727203900.94258: done queuing things up, now waiting for results queue to drain
19285 1727203900.94260: waiting for pending results...
19285 1727203900.94445: running TaskExecutor() for managed-node2/TASK: Gathering Facts
19285 1727203900.94523: in run() - task 028d2410-947f-f31b-fb3f-00000000007e
19285 1727203900.94598: variable 'ansible_search_path' from source: unknown
19285 1727203900.94602: calling self._execute()
19285 1727203900.94664: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203900.94678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203900.94693: variable 'omit' from source: magic vars
19285 1727203900.94798: variable 'omit' from source: magic vars
19285 1727203900.94836: variable 'omit' from source: magic vars
19285 1727203900.94885: variable 'omit' from source: magic vars
19285 1727203900.94933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19285 1727203900.95083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19285 1727203900.95086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19285 1727203900.95089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203900.95091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203900.95093: variable 'inventory_hostname' from source: host vars for 'managed-node2'
19285 1727203900.95096: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203900.95099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203900.95200: Set connection var ansible_module_compression to ZIP_DEFLATED
19285 1727203900.95215: Set connection var ansible_pipelining to False
19285 1727203900.95223: Set connection var ansible_timeout to 10
19285 1727203900.95229: Set connection var ansible_shell_type to sh
19285 1727203900.95238: Set connection var ansible_shell_executable to /bin/sh 19285 1727203900.95243: Set connection var ansible_connection to ssh 19285 1727203900.95306: variable 'ansible_shell_executable' from source: unknown 19285 1727203900.95316: variable 'ansible_connection' from source: unknown 19285 1727203900.95325: variable 'ansible_module_compression' from source: unknown 19285 1727203900.95330: variable 'ansible_shell_type' from source: unknown 19285 1727203900.95336: variable 'ansible_shell_executable' from source: unknown 19285 1727203900.95342: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203900.95403: variable 'ansible_pipelining' from source: unknown 19285 1727203900.95406: variable 'ansible_timeout' from source: unknown 19285 1727203900.95408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203900.95551: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203900.95565: variable 'omit' from source: magic vars 19285 1727203900.95573: starting attempt loop 19285 1727203900.95582: running the handler 19285 1727203900.95602: variable 'ansible_facts' from source: unknown 19285 1727203900.95632: _low_level_execute_command(): starting 19285 1727203900.95650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203900.96407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203900.96496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203900.96513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203900.96536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203900.96578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203900.96654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203900.98363: stdout chunk (state=3): >>>/root <<< 19285 1727203900.98747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203900.98750: stdout chunk (state=3): >>><<< 19285 1727203900.98753: stderr chunk (state=3): >>><<< 19285 1727203900.98755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203900.98758: _low_level_execute_command(): starting 19285 1727203900.98760: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761 `" && echo ansible-tmp-1727203900.9862669-19378-30901698762761="` echo /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761 `" ) && sleep 0' 19285 1727203901.00827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203901.00831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203901.00834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203901.00843: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.00935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.01389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.01467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203901.03478: stdout chunk (state=3): >>>ansible-tmp-1727203900.9862669-19378-30901698762761=/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761 <<< 19285 1727203901.03881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203901.03885: stdout chunk (state=3): >>><<< 19285 1727203901.03887: stderr chunk (state=3): >>><<< 19285 1727203901.03890: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203900.9862669-19378-30901698762761=/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203901.03892: variable 'ansible_module_compression' from source: unknown 19285 1727203901.03940: ANSIBALLZ: Using generic lock for ansible.legacy.setup 19285 1727203901.03988: ANSIBALLZ: Acquiring lock 19285 1727203901.03996: ANSIBALLZ: Lock acquired: 140487240913488 19285 1727203901.04005: ANSIBALLZ: Creating module 19285 1727203901.61906: ANSIBALLZ: Writing module into payload 19285 1727203901.62247: ANSIBALLZ: Writing module 19285 1727203901.62331: ANSIBALLZ: Renaming module 19285 1727203901.62526: ANSIBALLZ: Done creating module 19285 1727203901.62529: variable 'ansible_facts' from source: unknown 19285 1727203901.62532: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203901.62534: _low_level_execute_command(): starting 19285 1727203901.62537: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 19285 1727203901.63951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.64230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203901.64242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.64288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.64411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203901.66147: stdout chunk (state=3): >>>PLATFORM <<< 19285 1727203901.66217: stdout chunk (state=3): >>>Linux <<< 19285 1727203901.66239: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 19285 1727203901.66364: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 19285 1727203901.66404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203901.66440: stderr chunk (state=3): >>><<< 19285 1727203901.66449: stdout chunk (state=3): >>><<< 19285 1727203901.66495: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203901.66810 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 19285 1727203901.66815: _low_level_execute_command(): starting 19285 1727203901.66817: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 19285 1727203901.66835: Sending initial data 19285 1727203901.66845: Sent initial data (1181 bytes) 19285 1727203901.67952: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203901.67966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203901.67980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.68049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203901.68118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.68224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.68321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203901.72097: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 19285 1727203901.72255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203901.72296: stderr chunk (state=3): >>><<< 19285 1727203901.72309: stdout chunk (state=3): >>><<< 19285 1727203901.72343: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203901.72554: variable 'ansible_facts' from source: unknown 19285 1727203901.72563: variable 'ansible_facts' from source: unknown 19285 1727203901.72639: variable 'ansible_module_compression' from source: unknown 19285 1727203901.72689: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203901.72737: variable 'ansible_facts' from source: unknown 19285 
1727203901.73010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py 19285 1727203901.73527: Sending initial data 19285 1727203901.73531: Sent initial data (153 bytes) 19285 1727203901.74723: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203901.74811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.74872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203901.74895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.75007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.75227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203901.76834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203901.76905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203901.76991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpal75aa9l /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py <<< 19285 1727203901.76994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py" <<< 19285 1727203901.77096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpal75aa9l" to remote "/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py" <<< 19285 1727203901.80832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203901.80837: stdout chunk (state=3): >>><<< 19285 1727203901.80839: stderr chunk (state=3): >>><<< 19285 1727203901.80842: done transferring module to remote 19285 1727203901.80844: _low_level_execute_command(): starting 19285 1727203901.80847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/ 
/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py && sleep 0' 19285 1727203901.81646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.81680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203901.81696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.81710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.81812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203901.84258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203901.84265: stdout chunk (state=3): >>><<< 19285 1727203901.84267: stderr chunk (state=3): >>><<< 19285 1727203901.84270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203901.84272: _low_level_execute_command(): starting 19285 1727203901.84274: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/AnsiballZ_setup.py && sleep 0' 19285 1727203901.85016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203901.85021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203901.85089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.85093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203901.85096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203901.85146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203901.85188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203901.85285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203901.88361: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 19285 1727203901.88368: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203901.88373: stdout chunk (state=3): >>>import '_codecs' # <<< 19285 1727203901.88432: stdout chunk (state=3): >>>import 'codecs' # <<< 19285 1727203901.88437: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 19285 1727203901.88461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f1946a184d0> <<< 19285 1727203901.88467: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19469e7b30> <<< 19285 1727203901.88581: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 19285 1727203901.88782: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 19285 1727203901.88811: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 19285 1727203901.88817: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 19285 1727203901.88828: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 19285 1727203901.89088: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19467e9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f19467ea060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 19285 1727203901.89423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 19285 1727203901.89429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 19285 1727203901.89445: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203901.89466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 19285 1727203901.89523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 19285 1727203901.89551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 19285 1727203901.89568: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946827e90> <<< 19285 1727203901.89636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946827f50> <<< 19285 1727203901.89647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 19285 1727203901.89801: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 19285 1727203901.89804: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194685f890> <<< 19285 1727203901.89830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 19285 1727203901.89837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 19285 1727203901.89849: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194685ff20> <<< 19285 1727203901.89864: stdout chunk (state=3): >>>import '_collections' # <<< 19285 1727203901.89902: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683fb60> <<< 19285 1727203901.89955: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683d280> <<< 19285 1727203901.90035: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946825040> <<< 19285 1727203901.90066: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 19285 1727203901.90087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 19285 1727203901.90198: stdout chunk (state=3): >>>import '_sre' # # 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 19285 1727203901.90410: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946883770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946882390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946826900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b4830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468242c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468b4ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b4b90> # extension module 'binascii' loaded 
from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203901.90551: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468b4f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946822de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b5640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b5310> import 'importlib.machinery' # <<< 19285 1727203901.90571: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 19285 1727203901.90596: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6510> <<< 19285 1727203901.90618: stdout chunk (state=3): >>>import 'importlib.util' # <<< 19285 1727203901.90644: stdout chunk (state=3): >>>import 'runpy' # <<< 19285 1727203901.90770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468cc710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cddc0> <<< 19285 1727203901.90870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468cec60> <<< 19285 1727203901.90891: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cf290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468ce1b0> <<< 19285 1727203901.90986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cfd10> 
import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468cf440> <<< 19285 1727203901.91025: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6480> <<< 19285 1727203901.91108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 19285 1727203901.91195: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465c3c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203901.91219: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ec7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465ec530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203901.91394: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ec710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203901.91471: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ed0a0> <<< 19285 1727203901.91621: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465eda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465ec950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465c1e20> <<< 19285 1727203901.91750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465eee40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465edb80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6c30> <<< 19285 1727203901.91789: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 19285 1727203901.91831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203901.91964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19466171d0> <<< 19285 1727203901.91967: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 19285 1727203901.91986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203901.92001: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 19285 1727203901.92014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 19285 1727203901.92062: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194663f560> <<< 19285 1727203901.92088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 19285 1727203901.92169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 19285 1727203901.92187: stdout chunk (state=3): >>>import 'ntpath' # <<< 19285 1727203901.92210: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' 
import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669c2f0> <<< 19285 1727203901.92282: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 19285 1727203901.92321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 19285 1727203901.92411: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669ea50> <<< 19285 1727203901.92492: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669c410> <<< 19285 1727203901.92531: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946665340> <<< 19285 1727203901.92545: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f25430> <<< 19285 1727203901.92567: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194663e360> <<< 19285 1727203901.92592: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465efda0> <<< 19285 1727203901.92758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 19285 1727203901.92779: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f194663e480> <<< 19285 1727203901.93161: stdout 
chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_bl66z08m/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 19285 1727203901.93164: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203901.93187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 19285 1727203901.93235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 19285 1727203901.93424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f8b110> import '_typing' # <<< 19285 1727203901.93694: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f6a000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f69160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 19285 1727203901.95032: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203901.96165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f88fb0> <<< 19285 1727203901.96193: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203901.96221: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 19285 1727203901.96274: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 19285 1727203901.96385: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbaab0> <<< 19285 1727203901.96398: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fba840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fba150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 19285 1727203901.96432: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fbaba0> <<< 19285 1727203901.96436: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946a1a9c0> import 'atexit' # <<< 19285 1727203901.96475: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # 
extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbb830> <<< 19285 1727203901.96499: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203901.96593: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbba70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 19285 1727203901.96639: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fbbfb0> import 'pwd' # <<< 19285 1727203901.96658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 19285 1727203901.96740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e25cd0> <<< 19285 1727203901.96756: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e278f0> <<< 19285 1727203901.96810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 19285 
1727203901.97173: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 19285 1727203901.97183: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2bf50> <<< 19285 1727203901.97186: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1946a1bf20> <<< 19285 1727203901.97202: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 19285 1727203901.97335: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 19285 1727203901.97338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 19285 1727203901.97341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 19285 1727203901.97343: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e33d70> <<< 19285 1727203901.97721: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e32840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e325d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e32b10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2a720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e77a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e79be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e799a0> <<< 19285 1727203901.97823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 19285 1727203901.97827: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e7c140> <<< 19285 1727203901.97832: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7a2d0> <<< 19285 1727203901.97980: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 19285 1727203901.98019: stdout chunk (state=3): >>>import '_string' # import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7f920> <<< 19285 1727203901.98127: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7c2f0> <<< 19285 1727203901.98162: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 19285 1727203901.98169: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e80a10> <<< 19285 1727203901.98814: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e809b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e80c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e782c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d0c1a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d0d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e82930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e83ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e825a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203901.98828: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 19285 1727203901.98844: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 19285 1727203901.98856: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203901.98984: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203901.99097: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203901.99633: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.00234: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 19285 1727203902.00296: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d11580> <<< 19285 1727203902.00358: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 19285 1727203902.00564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d12330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d0d550> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 19285 1727203902.00655: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.00891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d122d0> # zipimport: zlib available <<< 19285 
1727203902.01283: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.01897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 19285 1727203902.01916: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.01948: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 19285 1727203902.01962: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.02028: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.02299: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 19285 1727203902.02468: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.02769: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 19285 1727203902.02836: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d13470> <<< 19285 1727203902.02994: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 19285 1727203902.03092: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.03111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 19285 1727203902.03199: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available <<< 19285 1727203902.03308: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.03340: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 19285 1727203902.03373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203902.03595: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d1e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d196d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 19285 1727203902.03620: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.03680: stdout chunk (state=3): >>># zipimport: zlib available<<< 19285 1727203902.03686: stdout chunk (state=3): >>> <<< 19285 1727203902.03701: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.03754: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203902.03960: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 19285 1727203902.03978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e069f0> <<< 19285 1727203902.04018: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945efa6c0> <<< 19285 1727203902.04096: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d1e1e0> <<< 19285 1727203902.04386: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e80e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 19285 1727203902.04419: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.04422: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.04516: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.04622: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 19285 1727203902.04699: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 
1727203902.04839: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 19285 1727203902.05294: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203902.05316: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 19285 1727203902.05323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 19285 1727203902.05610: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db2090> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459ebf50> # extension module 
'_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f0440> <<< 19285 1727203902.05793: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d9b170> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db2c00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db03b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 19285 1727203902.05821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 19285 1727203902.05824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 19285 1727203902.05887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 19285 1727203902.05891: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203902.05895: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f32f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f19459f2ba0> <<< 19285 1727203902.06078: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f2d80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f2000> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 19285 1727203902.06086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 19285 1727203902.06108: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f3470> <<< 19285 1727203902.06111: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 19285 1727203902.06301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945a55f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f3f80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db04d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.other' # <<< 19285 1727203902.06305: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.06444: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 19285 1727203902.06892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 19285 1727203902.07008: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.07029: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.07092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 19285 1727203902.07110: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.07696: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.08097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.08149: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.08223: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 19285 1727203902.08247: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.08531: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.08736: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.08764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945a560f0> <<< 19285 1727203902.08785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 19285 1727203902.08847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 19285 1727203902.08997: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945a56cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 19285 1727203902.09090: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 19285 1727203902.09377: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.09419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 19285 1727203902.09422: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.09494: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.09599: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 19285 1727203902.09602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 19285 1727203902.09635: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203902.09685: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203902.09727: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945a8e2a0> <<< 19285 1727203902.09963: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945a7e0f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 19285 1727203902.09967: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.10008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 19285 1727203902.10022: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.10211: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.10302: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.10590: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.10642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 19285 1727203902.10648: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 19285 1727203902.10739: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945aa5d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945aa5cd0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 19285 1727203902.10743: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 19285 1727203902.10822: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.10843: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 19285 1727203902.10856: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.11081: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.11177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 19285 1727203902.11268: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.11357: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.11604: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.11698: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.11785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 19285 1727203902.11980: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19285 1727203902.11984: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.12296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.12682: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.13181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 19285 1727203902.13195: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.13434: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.13451: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 19285 1727203902.13506: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.13690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 19285 1727203902.14043: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.14098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 19285 1727203902.14301: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.14455: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.14699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 19285 1727203902.14712: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.14995: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.15086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.15338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 19285 1727203902.15531: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.15896: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.15923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 19285 1727203902.15926: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.15961: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.16231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 19285 1727203902.16283: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.16445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 19285 1727203902.16477: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.16769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.16988: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 19285 1727203902.17065: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.17323: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.17351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 19285 1727203902.17365: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.17416: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.17523: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 19285 1727203902.17744: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203902.17829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 19285 1727203902.17973: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203902.18211: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f194583a570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194583bdd0> <<< 19285 1727203902.18326: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945833da0> <<< 19285 1727203902.29252: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 19285 1727203902.29277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945882ed0> <<< 19285 1727203902.29572: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19458811f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19458834d0> <<< 19285 1727203902.29657: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945882150> <<< 19285 1727203902.30105: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 19285 1727203902.55940: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keyt<<< 19285 1727203902.55970: stdout chunk (state=3): >>>ype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "42", "epoch": "1727203902", "epoch_int": "1727203902", "date": "2024-09-24", "time": "14:51:42", "iso8601_micro": "2024-09-24T18:51:42.192506Z", "iso8601": "2024-09-24T18:51:42Z", "iso8601_basic": "20240924T145142192506", "iso8601_basic_short": "20240924T145142", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.51025390625, "5m": 0.38671875, "15m": 0.19287109375}, "ansible_is_chroot": false, 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amaz<<< 19285 1727203902.56074: stdout chunk (state=3): >>>on", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, 
"rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 488, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788266496, "block_size": 4096, "block_total": 65519099, "block_available": 63913151, "block_used": 1605948, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": 
"255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203902.56807: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 19285 1727203902.56930: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 19285 1727203902.56957: stdout chunk (state=3): >>> # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] 
removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg <<< 19285 1727203902.56961: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy<<< 19285 1727203902.57005: stdout chunk 
(state=3): >>> # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib<<< 19285 1727203902.57022: stdout chunk (state=3): >>> # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading<<< 19285 1727203902.57161: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 19285 1727203902.57205: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] 
removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves<<< 19285 1727203902.57230: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections<<< 19285 1727203902.57233: stdout chunk (state=3): 
>>> # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool<<< 19285 1727203902.57595: stdout chunk (state=3): >>> # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing 
ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # 
cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux <<< 19285 1727203902.57684: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # 
destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata<<< 19285 1727203902.57706: stdout chunk (state=3): >>> # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 19285 1727203902.58227: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19285 1727203902.58251: stdout chunk (state=3): >>># destroy importlib.machinery<<< 19285 1727203902.58284: stdout chunk (state=3): >>> # destroy importlib._abc # destroy importlib.util # destroy _bz2<<< 19285 1727203902.58299: stdout chunk (state=3): >>> # destroy _compression<<< 19285 1727203902.58364: stdout chunk (state=3): >>> # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 19285 1727203902.58399: stdout chunk (state=3): >>> # destroy zipfile<<< 19285 1727203902.58402: stdout chunk (state=3): >>> <<< 19285 
1727203902.58478: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 19285 1727203902.58569: stdout chunk (state=3): >>># destroy importlib <<< 19285 1727203902.58573: stdout chunk (state=3): >>># destroy zipimport<<< 19285 1727203902.58693: stdout chunk (state=3): >>> # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro<<< 19285 1727203902.58713: stdout chunk (state=3): >>> # destroy argparse <<< 19285 1727203902.59001: stdout chunk (state=3): >>># destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime<<< 19285 1727203902.59016: stdout chunk (state=3): >>> # destroy subprocess # destroy base64<<< 19285 1727203902.59052: stdout chunk (state=3): >>> # destroy _ssl<<< 19285 1727203902.59114: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.selinux<<< 19285 1727203902.59210: stdout chunk (state=3): >>> # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket <<< 19285 1727203902.59224: stdout chunk (state=3): >>># destroy struct # destroy glob<<< 19285 1727203902.59227: stdout chunk (state=3): >>> <<< 19285 1727203902.59254: stdout chunk (state=3): >>># 
destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 19285 1727203902.59277: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 19285 1727203902.59503: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading<<< 19285 1727203902.59595: stdout chunk (state=3): >>> # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 19285 1727203902.59598: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping 
copyreg # cleanup[3] wiping re._parser<<< 19285 1727203902.59704: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 19285 1727203902.59710: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs<<< 19285 1727203902.59713: stdout chunk (state=3): >>> # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix<<< 19285 1727203902.59726: stdout chunk (state=3): >>> # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread<<< 19285 1727203902.59779: stdout chunk (state=3): >>> # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux<<< 19285 1727203902.59952: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 19285 1727203902.60209: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize<<< 19285 1727203902.60268: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib <<< 19285 1727203902.60271: stdout chunk (state=3): >>># destroy copyreg <<< 19285 
1727203902.60338: stdout chunk (state=3): >>># destroy contextlib # destroy _typing<<< 19285 1727203902.60341: stdout chunk (state=3): >>> <<< 19285 1727203902.60389: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response<<< 19285 1727203902.60602: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 19285 1727203902.60842: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437<<< 19285 1727203902.60846: stdout chunk (state=3): >>> # destroy encodings.idna<<< 19285 1727203902.60848: stdout chunk (state=3): >>> # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib<<< 19285 1727203902.60851: stdout chunk (state=3): >>> <<< 19285 1727203902.60884: stdout chunk (state=3): >>># destroy _operator<<< 19285 1727203902.61200: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 19285 1727203902.61683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203902.61748: stdout chunk (state=3): >>><<< 19285 1727203902.61751: stderr chunk (state=3): >>><<< 19285 1727203902.62518: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946a184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19469e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946a1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19467e9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19467ea060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946827e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946827f50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194685f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194685ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946825040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946883770> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1946882390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194683e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946826900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b4830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468242c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468b4ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b4b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468b4f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946822de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b5640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b5310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468cc710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19468cec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cf290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468ce1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19468cfd10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468cf440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465c3c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ec7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465ec530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ec710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465ed0a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19465eda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465ec950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465c1e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465eee40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465edb80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19468b6c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19466171d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194663f560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669c2f0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669ea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194669c410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946665340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f25430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194663e360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19465efda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f194663e480> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_bl66z08m/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f8b110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f6a000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f69160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945f88fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbaab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fba840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fba150> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fbaba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1946a1a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbb830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945fbba70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945fbbfb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e25cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e278f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1946a1bf20> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e33d70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e32840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e325d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e32b10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e2a720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e77a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e79be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e799a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e7c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7f920> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e7c2f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e80a10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e809b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e80c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e782c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d0c1a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d0d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e82930> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945e83ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e825a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d12330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d0d550> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d122d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d13470> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945d1e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d196d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e069f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945efa6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d1e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945e80e60> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db2090> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459ebf50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f0440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945d9b170> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db2c00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db03b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f32f0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f19459f2ba0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f19459f2d80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f2000> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f3470> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945a55f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19459f3f80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945db04d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945a560f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1945a56cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945a8e2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945a7e0f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1945aa5d60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945aa5cd0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f194583a570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f194583bdd0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945833da0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945882ed0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19458811f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f19458834d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1945882150> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", 
"ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, 
"micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "42", "epoch": "1727203902", "epoch_int": "1727203902", "date": "2024-09-24", "time": "14:51:42", "iso8601_micro": "2024-09-24T18:51:42.192506Z", "iso8601": "2024-09-24T18:51:42Z", "iso8601_basic": "20240924T145142192506", "iso8601_basic_short": "20240924T145142", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.51025390625, "5m": 0.38671875, "15m": 0.19287109375}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2937, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 594, "free": 2937}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, 
"start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 488, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788266496, "block_size": 4096, "block_total": 65519099, "block_available": 63913151, "block_used": 1605948, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing 
_frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # 
cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] 
removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing 
ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # 
cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # 
cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # 
destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # 
destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy 
multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] 
wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. [WARNING]: Module invocation had junk after the JSON data:
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # 
destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy 
configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # 
cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. 
See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 19285 1727203902.67083: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203902.67204: _low_level_execute_command(): starting 19285 1727203902.67208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203900.9862669-19378-30901698762761/ > /dev/null 2>&1 && sleep 0' 19285 1727203902.68468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203902.68472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203902.68488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203902.68740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203902.69043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203902.70913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203902.70918: stdout chunk (state=3): >>><<< 19285 1727203902.70920: stderr chunk (state=3): >>><<< 19285 1727203902.70922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from 
master 0 19285 1727203902.70924: handler run complete 19285 1727203902.71146: variable 'ansible_facts' from source: unknown 19285 1727203902.71401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.72038: variable 'ansible_facts' from source: unknown 19285 1727203902.72224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.72447: attempt loop complete, returning result 19285 1727203902.72609: _execute() done 19285 1727203902.72612: dumping result to json 19285 1727203902.72614: done dumping result, returning 19285 1727203902.72618: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-00000000007e] 19285 1727203902.72626: sending task result for task 028d2410-947f-f31b-fb3f-00000000007e ok: [managed-node2] 19285 1727203902.74493: no more pending results, returning what we have 19285 1727203902.74496: results queue empty 19285 1727203902.74497: checking for any_errors_fatal 19285 1727203902.74499: done checking for any_errors_fatal 19285 1727203902.74499: checking for max_fail_percentage 19285 1727203902.74501: done checking for max_fail_percentage 19285 1727203902.74502: checking to see if all hosts have failed and the running result is not ok 19285 1727203902.74502: done checking to see if all hosts have failed 19285 1727203902.74503: getting the remaining hosts for this loop 19285 1727203902.74505: done getting the remaining hosts for this loop 19285 1727203902.74508: getting the next task for host managed-node2 19285 1727203902.74514: done getting next task for host managed-node2 19285 1727203902.74516: ^ task is: TASK: meta (flush_handlers) 19285 1727203902.74518: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203902.74521: getting variables 19285 1727203902.74522: in VariableManager get_vars() 19285 1727203902.74542: Calling all_inventory to load vars for managed-node2 19285 1727203902.74544: Calling groups_inventory to load vars for managed-node2 19285 1727203902.74547: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203902.74555: Calling all_plugins_play to load vars for managed-node2 19285 1727203902.74557: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203902.74560: Calling groups_plugins_play to load vars for managed-node2 19285 1727203902.75098: done sending task result for task 028d2410-947f-f31b-fb3f-00000000007e 19285 1727203902.75102: WORKER PROCESS EXITING 19285 1727203902.75550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.76571: done with get_vars() 19285 1727203902.76588: done getting variables 19285 1727203902.76708: in VariableManager get_vars() 19285 1727203902.76719: Calling all_inventory to load vars for managed-node2 19285 1727203902.76721: Calling groups_inventory to load vars for managed-node2 19285 1727203902.76724: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203902.76728: Calling all_plugins_play to load vars for managed-node2 19285 1727203902.76731: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203902.76734: Calling groups_plugins_play to load vars for managed-node2 19285 1727203902.76880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.77105: done with get_vars() 19285 1727203902.77118: done queuing things up, now waiting for results queue to drain 19285 1727203902.77120: results queue empty 19285 1727203902.77121: checking for any_errors_fatal 19285 1727203902.77124: done 
checking for any_errors_fatal 19285 1727203902.77124: checking for max_fail_percentage 19285 1727203902.77125: done checking for max_fail_percentage 19285 1727203902.77126: checking to see if all hosts have failed and the running result is not ok 19285 1727203902.77127: done checking to see if all hosts have failed 19285 1727203902.77137: getting the remaining hosts for this loop 19285 1727203902.77138: done getting the remaining hosts for this loop 19285 1727203902.77141: getting the next task for host managed-node2 19285 1727203902.77145: done getting next task for host managed-node2 19285 1727203902.77148: ^ task is: TASK: Include the task 'el_repo_setup.yml' 19285 1727203902.77149: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203902.77151: getting variables 19285 1727203902.77152: in VariableManager get_vars() 19285 1727203902.77400: Calling all_inventory to load vars for managed-node2 19285 1727203902.77403: Calling groups_inventory to load vars for managed-node2 19285 1727203902.77405: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203902.77410: Calling all_plugins_play to load vars for managed-node2 19285 1727203902.77412: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203902.77415: Calling groups_plugins_play to load vars for managed-node2 19285 1727203902.77555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.78001: done with get_vars() 19285 1727203902.78009: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 
Tuesday 24 September 2024 14:51:42 -0400 (0:00:01.843) 0:00:01.857 ***** 19285 1727203902.78266: entering _queue_task() for managed-node2/include_tasks 19285 1727203902.78268: Creating lock for include_tasks 19285 1727203902.79000: worker is 1 (out of 1 available) 19285 1727203902.79020: exiting _queue_task() for managed-node2/include_tasks 19285 1727203902.79032: done queuing things up, now waiting for results queue to drain 19285 1727203902.79033: waiting for pending results... 19285 1727203902.79456: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 19285 1727203902.79667: in run() - task 028d2410-947f-f31b-fb3f-000000000006 19285 1727203902.79724: variable 'ansible_search_path' from source: unknown 19285 1727203902.79910: calling self._execute() 19285 1727203902.79989: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203902.80002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203902.80038: variable 'omit' from source: magic vars 19285 1727203902.80345: _execute() done 19285 1727203902.80349: dumping result to json 19285 1727203902.80353: done dumping result, returning 19285 1727203902.80356: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-f31b-fb3f-000000000006] 19285 1727203902.80361: sending task result for task 028d2410-947f-f31b-fb3f-000000000006 19285 1727203902.80637: no more pending results, returning what we have 19285 1727203902.80643: in VariableManager get_vars() 19285 1727203902.80691: Calling all_inventory to load vars for managed-node2 19285 1727203902.80694: Calling groups_inventory to load vars for managed-node2 19285 1727203902.80699: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203902.80712: Calling all_plugins_play to load vars for managed-node2 19285 1727203902.80715: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203902.80718: 
Calling groups_plugins_play to load vars for managed-node2 19285 1727203902.81320: done sending task result for task 028d2410-947f-f31b-fb3f-000000000006 19285 1727203902.81323: WORKER PROCESS EXITING 19285 1727203902.81622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.82008: done with get_vars() 19285 1727203902.82017: variable 'ansible_search_path' from source: unknown 19285 1727203902.82032: we have included files to process 19285 1727203902.82033: generating all_blocks data 19285 1727203902.82034: done generating all_blocks data 19285 1727203902.82035: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19285 1727203902.82036: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19285 1727203902.82039: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19285 1727203902.83762: in VariableManager get_vars() 19285 1727203902.83883: done with get_vars() 19285 1727203902.83896: done processing included file 19285 1727203902.83898: iterating over new_blocks loaded from include file 19285 1727203902.83900: in VariableManager get_vars() 19285 1727203902.83917: done with get_vars() 19285 1727203902.83919: filtering new block on tags 19285 1727203902.83933: done filtering new block on tags 19285 1727203902.83936: in VariableManager get_vars() 19285 1727203902.83947: done with get_vars() 19285 1727203902.83949: filtering new block on tags 19285 1727203902.83967: done filtering new block on tags 19285 1727203902.83970: in VariableManager get_vars() 19285 1727203902.83982: done with get_vars() 19285 1727203902.83984: filtering new block on tags 19285 1727203902.83996: done filtering new block on tags 19285 1727203902.83998: done iterating over new_blocks loaded from include 
file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 19285 1727203902.84004: extending task lists for all hosts with included blocks 19285 1727203902.84053: done extending task lists 19285 1727203902.84054: done processing included files 19285 1727203902.84055: results queue empty 19285 1727203902.84055: checking for any_errors_fatal 19285 1727203902.84057: done checking for any_errors_fatal 19285 1727203902.84057: checking for max_fail_percentage 19285 1727203902.84061: done checking for max_fail_percentage 19285 1727203902.84062: checking to see if all hosts have failed and the running result is not ok 19285 1727203902.84063: done checking to see if all hosts have failed 19285 1727203902.84063: getting the remaining hosts for this loop 19285 1727203902.84064: done getting the remaining hosts for this loop 19285 1727203902.84067: getting the next task for host managed-node2 19285 1727203902.84071: done getting next task for host managed-node2 19285 1727203902.84073: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 19285 1727203902.84279: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203902.84283: getting variables 19285 1727203902.84290: in VariableManager get_vars() 19285 1727203902.84298: Calling all_inventory to load vars for managed-node2 19285 1727203902.84300: Calling groups_inventory to load vars for managed-node2 19285 1727203902.84302: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203902.84308: Calling all_plugins_play to load vars for managed-node2 19285 1727203902.84310: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203902.84312: Calling groups_plugins_play to load vars for managed-node2 19285 1727203902.84673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203902.85065: done with get_vars() 19285 1727203902.85074: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:51:42 -0400 (0:00:00.071) 0:00:01.928 ***** 19285 1727203902.85401: entering _queue_task() for managed-node2/setup 19285 1727203902.86381: worker is 1 (out of 1 available) 19285 1727203902.86396: exiting _queue_task() for managed-node2/setup 19285 1727203902.86409: done queuing things up, now waiting for results queue to drain 19285 1727203902.86410: waiting for pending results... 
19285 1727203902.86817: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 19285 1727203902.87183: in run() - task 028d2410-947f-f31b-fb3f-00000000008f 19285 1727203902.87187: variable 'ansible_search_path' from source: unknown 19285 1727203902.87197: variable 'ansible_search_path' from source: unknown 19285 1727203902.87200: calling self._execute() 19285 1727203902.87259: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203902.87334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203902.87349: variable 'omit' from source: magic vars 19285 1727203902.88422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203902.92647: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203902.92835: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203902.93081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203902.93086: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203902.93090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203902.93314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203902.93318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203902.93321: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203902.93480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203902.93484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203902.93785: variable 'ansible_facts' from source: unknown 19285 1727203902.93926: variable 'network_test_required_facts' from source: task vars 19285 1727203902.94002: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 19285 1727203902.94077: when evaluation is False, skipping this task 19285 1727203902.94087: _execute() done 19285 1727203902.94095: dumping result to json 19285 1727203902.94103: done dumping result, returning 19285 1727203902.94115: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-f31b-fb3f-00000000008f] 19285 1727203902.94289: sending task result for task 028d2410-947f-f31b-fb3f-00000000008f 19285 1727203902.94360: done sending task result for task 028d2410-947f-f31b-fb3f-00000000008f 19285 1727203902.94363: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 19285 1727203902.94462: no more pending results, returning what we have 19285 1727203902.94466: results queue empty 19285 1727203902.94467: checking for any_errors_fatal 19285 1727203902.94468: 
done checking for any_errors_fatal 19285 1727203902.94469: checking for max_fail_percentage 19285 1727203902.94471: done checking for max_fail_percentage 19285 1727203902.94472: checking to see if all hosts have failed and the running result is not ok 19285 1727203902.94473: done checking to see if all hosts have failed 19285 1727203902.94474: getting the remaining hosts for this loop 19285 1727203902.94477: done getting the remaining hosts for this loop 19285 1727203902.94480: getting the next task for host managed-node2 19285 1727203902.94489: done getting next task for host managed-node2 19285 1727203902.94491: ^ task is: TASK: Check if system is ostree 19285 1727203902.94494: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203902.94496: getting variables
19285 1727203902.94499: in VariableManager get_vars()
19285 1727203902.94527: Calling all_inventory to load vars for managed-node2
19285 1727203902.94529: Calling groups_inventory to load vars for managed-node2
19285 1727203902.94533: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203902.94543: Calling all_plugins_play to load vars for managed-node2
19285 1727203902.94546: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203902.94548: Calling groups_plugins_play to load vars for managed-node2
19285 1727203902.95236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203902.95531: done with get_vars()
19285 1727203902.95542: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Tuesday 24 September 2024  14:51:42 -0400 (0:00:00.104)       0:00:02.032 *****
19285 1727203902.95848: entering _queue_task() for managed-node2/stat
19285 1727203902.96520: worker is 1 (out of 1 available)
19285 1727203902.96531: exiting _queue_task() for managed-node2/stat
19285 1727203902.96765: done queuing things up, now waiting for results queue to drain
19285 1727203902.96767: waiting for pending results...
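The skip decision recorded above hinges on a list-intersection check: gather facts only if the intersection of already-gathered fact keys with the required list does not equal the required list. The same test can be sketched in plain Python (the function name is ours, not Ansible's; Ansible evaluates the equivalent Jinja2 expression with its `intersect` filter):

```python
def needs_fact_gathering(ansible_facts: dict, required_facts: list) -> bool:
    """Return True when at least one required fact has not been gathered yet.

    Mirrors the logged conditional:
        not ansible_facts.keys() | list | intersect(required_facts) == required_facts
    """
    # Keep required-list order, keeping only facts already present.
    gathered = [fact for fact in required_facts if fact in ansible_facts]
    # If every required fact is present, the intersection equals the
    # required list and no gathering is needed.
    return gathered != required_facts
```

With all required facts present the function returns False, which is exactly why the task above was skipped with "Conditional result was False".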
19285 1727203902.97198: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 19285 1727203902.97281: in run() - task 028d2410-947f-f31b-fb3f-000000000091 19285 1727203902.97285: variable 'ansible_search_path' from source: unknown 19285 1727203902.97289: variable 'ansible_search_path' from source: unknown 19285 1727203902.97292: calling self._execute() 19285 1727203902.97470: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203902.97633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203902.97637: variable 'omit' from source: magic vars 19285 1727203902.98781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203902.99195: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203902.99243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203902.99480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203902.99485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203902.99681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203902.99686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203902.99730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203902.99763: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203903.00126: Evaluated conditional (not __network_is_ostree is defined): True 19285 1727203903.00130: variable 'omit' from source: magic vars 19285 1727203903.00344: variable 'omit' from source: magic vars 19285 1727203903.00347: variable 'omit' from source: magic vars 19285 1727203903.00350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203903.00352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203903.00562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203903.00566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203903.00568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203903.00571: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203903.00573: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.00576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.00739: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203903.00795: Set connection var ansible_pipelining to False 19285 1727203903.00811: Set connection var ansible_timeout to 10 19285 1727203903.00980: Set connection var ansible_shell_type to sh 19285 1727203903.00983: Set connection var ansible_shell_executable to /bin/sh 19285 1727203903.00986: Set connection var ansible_connection to ssh 19285 1727203903.00988: variable 'ansible_shell_executable' from source: unknown 19285 1727203903.00992: variable 'ansible_connection' from 
source: unknown 19285 1727203903.00995: variable 'ansible_module_compression' from source: unknown 19285 1727203903.00997: variable 'ansible_shell_type' from source: unknown 19285 1727203903.00999: variable 'ansible_shell_executable' from source: unknown 19285 1727203903.01001: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.01003: variable 'ansible_pipelining' from source: unknown 19285 1727203903.01005: variable 'ansible_timeout' from source: unknown 19285 1727203903.01007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.01284: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203903.01288: variable 'omit' from source: magic vars 19285 1727203903.01291: starting attempt loop 19285 1727203903.01293: running the handler 19285 1727203903.01295: _low_level_execute_command(): starting 19285 1727203903.01297: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203903.03478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.03747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203903.03939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.04266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203903.05891: stdout chunk (state=3): >>>/root <<< 19285 1727203903.06532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203903.06535: stdout chunk (state=3): >>><<< 19285 1727203903.06538: stderr chunk (state=3): >>><<< 19285 1727203903.06540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203903.06550: _low_level_execute_command(): starting 19285 1727203903.06553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521 `" && echo ansible-tmp-1727203903.0643332-19587-242293927181521="` echo /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521 `" ) && sleep 0' 19285 1727203903.07688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203903.07794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203903.07848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203903.07861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.08030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.08189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 
1727203903.10137: stdout chunk (state=3): >>>ansible-tmp-1727203903.0643332-19587-242293927181521=/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521 <<< 19285 1727203903.10285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203903.10297: stdout chunk (state=3): >>><<< 19285 1727203903.10332: stderr chunk (state=3): >>><<< 19285 1727203903.10482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203903.0643332-19587-242293927181521=/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203903.10486: variable 'ansible_module_compression' from source: unknown 19285 1727203903.10573: ANSIBALLZ: Using lock for stat 19285 1727203903.10635: ANSIBALLZ: Acquiring lock 19285 1727203903.10643: ANSIBALLZ: Lock acquired: 
140487240914400 19285 1727203903.10657: ANSIBALLZ: Creating module 19285 1727203903.39664: ANSIBALLZ: Writing module into payload 19285 1727203903.39872: ANSIBALLZ: Writing module 19285 1727203903.40082: ANSIBALLZ: Renaming module 19285 1727203903.40086: ANSIBALLZ: Done creating module 19285 1727203903.40088: variable 'ansible_facts' from source: unknown 19285 1727203903.40318: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py 19285 1727203903.40550: Sending initial data 19285 1727203903.40564: Sent initial data (153 bytes) 19285 1727203903.41691: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203903.41705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203903.41720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.42072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.42611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 
<<< 19285 1727203903.44727: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19285 1727203903.44735: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 19285 1727203903.44742: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 19285 1727203903.44750: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 19285 1727203903.44756: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 19285 1727203903.44764: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 19285 1727203903.44801: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203903.45127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py" <<< 19285 1727203903.45130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpa4v9p94h /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py <<< 19285 1727203903.45299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpa4v9p94h" to remote "/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py" <<< 19285 1727203903.47415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203903.47438: stderr chunk (state=3): >>><<< 19285 1727203903.47442: stdout chunk (state=3): >>><<< 19285 1727203903.47478: done transferring module to remote 19285 1727203903.47580: _low_level_execute_command(): starting 19285 1727203903.47584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/ /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py && sleep 0' 19285 1727203903.48954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203903.49012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203903.49023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203903.49037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203903.49164: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203903.49167: stderr chunk (state=3): >>>debug2: match not 
found <<< 19285 1727203903.49170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.49172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203903.49174: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203903.49178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203903.49181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203903.49182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203903.49193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203903.49579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203903.49694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.49933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203903.51638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203903.51641: stdout chunk (state=3): >>><<< 19285 1727203903.51681: stderr chunk (state=3): >>><<< 19285 1727203903.51684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203903.51687: _low_level_execute_command(): starting 19285 1727203903.51689: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/AnsiballZ_stat.py && sleep 0' 19285 1727203903.53118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.53180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203903.53231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203903.53285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.53551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203903.56480: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 19285 1727203903.56653: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # <<< 19285 1727203903.56873: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538eaa50> <<< 19285 1727203903.56916: stdout chunk (state=3): >>>import '_signal' # 
import '_abc' # import 'abc' # <<< 19285 1727203903.57026: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 19285 1727203903.57094: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # <<< 19285 1727203903.57103: stdout chunk (state=3): >>>Processing user site-packages <<< 19285 1727203903.57154: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 19285 1727203903.57157: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 19285 1727203903.57172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 19285 1727203903.57353: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553699130> <<< 19285 1727203903.57400: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355369a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 19285 1727203903.57533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 19285 1727203903.57573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 19285 1727203903.57587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.57642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 19285 1727203903.57645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 19285 1727203903.57686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 19285 1727203903.57693: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d7f50> <<< 19285 1727203903.57904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536ec0e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355370f980> <<< 19285 1727203903.57930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 19285 1727203903.57943: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355370ff50> <<< 19285 1727203903.57957: stdout chunk (state=3): >>>import '_collections' # <<< 19285 1727203903.58007: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536efc20> import '_functools' # <<< 19285 1727203903.58112: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536ed340> <<< 19285 1727203903.58130: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d5100> <<< 19285 1727203903.58359: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553733950> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3553732570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536ee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553730d70> <<< 19285 1727203903.58365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 19285 1727203903.58405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553760950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d4380> <<< 19285 1727203903.58411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 19285 1727203903.58456: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.58493: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3553760e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553760cb0> <<< 19285 1727203903.58554: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.58607: stdout chunk (state=3): >>># extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35537610a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d2ea0> <<< 19285 1727203903.58663: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 19285 1727203903.58669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553761760> <<< 19285 1727203903.58671: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553761460> <<< 19285 1727203903.58673: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 19285 1727203903.58837: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 19285 1727203903.58841: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553762660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377c860> <<< 19285 
1727203903.58844: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377dfa0> <<< 19285 1727203903.58994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377ee40> <<< 19285 1727203903.59001: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377f4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377e390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377ff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377f650> <<< 19285 1727203903.59050: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3553762690> <<< 19285 1727203903.59119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 19285 1727203903.59219: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3553503da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352c8f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352c650> <<< 19285 1727203903.59225: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.59797: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352c920> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code 
object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352d250> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352dc40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352cb00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553501f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352f050> <<< 19285 1727203903.59903: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352dd90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553762d80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 19285 1727203903.59947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.60082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc 
matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535573e0> <<< 19285 1727203903.60115: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 19285 1727203903.60123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.60172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 19285 1727203903.60367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355357b7a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 19285 1727203903.60393: stdout chunk (state=3): >>>import 'ntpath' # <<< 19285 1727203903.60425: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535d8590> <<< 19285 1727203903.60432: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 19285 1727203903.60498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 19285 1727203903.60588: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 19285 1727203903.60727: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535dacf0> <<< 19285 1727203903.60790: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535d86b0> <<< 19285 1727203903.60861: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535a15b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f29700> <<< 19285 1727203903.60916: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355357a5a0> <<< 19285 1727203903.60929: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352ffb0> <<< 19285 1727203903.61126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f355357a900> <<< 19285 1727203903.61285: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_488dqboz/ansible_stat_payload.zip' <<< 19285 1727203903.61435: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.61511: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.61567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 19285 1727203903.61587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 19285 1727203903.61606: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 19285 1727203903.61714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 19285 1727203903.61747: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f7b470> <<< 19285 1727203903.61806: stdout chunk (state=3): >>>import '_typing' # <<< 19285 1727203903.62036: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f5e360> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f5d4c0> <<< 19285 1727203903.62067: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.62095: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 19285 1727203903.62118: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203903.62151: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 19285 1727203903.63857: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.64962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f79340> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 19285 1727203903.64968: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa6de0> <<< 19285 1727203903.64981: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6b70> <<< 19285 1727203903.65014: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6480> <<< 19285 1727203903.65032: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 19285 1727203903.65043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 19285 1727203903.65077: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6ed0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538ea9c0> <<< 19285 1727203903.65194: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa7b00> # extension 
module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa7d40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 19285 1727203903.65229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 19285 1727203903.65298: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fcc230> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 19285 1727203903.65401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e11f10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e13b30> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 19285 1727203903.65493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e14500> <<< 19285 1727203903.65562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3552e153d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 19285 1727203903.65585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 19285 1727203903.65623: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e200e0> <<< 19285 1727203903.65669: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e20230> <<< 19285 1727203903.65806: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e163c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 19285 1727203903.65999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3552e23f80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e22a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e227b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e22d20> <<< 19285 1727203903.66108: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e168d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e681a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e68350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 19285 1727203903.66127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 19285 1727203903.66165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 19285 1727203903.66280: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e69df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e69bb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 19285 1727203903.66367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 19285 1727203903.66370: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.66377: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e6c2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6a4e0> <<< 19285 1727203903.66398: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 19285 1727203903.66426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.66500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6faa0> <<< 19285 1727203903.66622: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6c4a0> <<< 19285 1727203903.66680: stdout chunk (state=3): >>># extension 
module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.66834: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e705c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.66837: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e70e90> <<< 19285 1727203903.66843: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e70920> <<< 19285 1727203903.66892: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e68470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 19285 1727203903.66952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552efc260> <<< 19285 1727203903.67012: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.67025: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552efd520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e72a20> <<< 19285 1727203903.67391: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e73da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e72630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 19285 1727203903.67449: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.67563: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.68097: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.68795: stdout chunk 
(state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.68823: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552d01820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d027b0> <<< 19285 1727203903.68827: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552efd760> <<< 19285 1727203903.68907: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 19285 1727203903.68918: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 19285 1727203903.69199: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.69217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 19285 1727203903.69274: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d02c60> # zipimport: zlib available <<< 
19285 1727203903.69733: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.70117: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.70390: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 19285 1727203903.70597: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 19285 1727203903.70614: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 19285 1727203903.70836: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.71158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 19285 1727203903.71200: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d039e0> <<< 19285 1727203903.71204: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.71393: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 19285 1727203903.71413: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.71466: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 19285 1727203903.71702: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 19285 1727203903.71727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 19285 1727203903.71787: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 19285 1727203903.71799: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552d0e3c0> <<< 19285 1727203903.71845: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d0b650> <<< 19285 1727203903.71884: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 19285 1727203903.71928: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19285 1727203903.72067: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 19285 1727203903.72258: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fd2ba0> <<< 19285 1727203903.72299: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fee870> <<< 19285 1727203903.72504: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d0e480> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d031d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 19285 1727203903.72598: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 19285 1727203903.72601: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 19285 1727203903.72668: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.72863: stdout chunk (state=3): >>># zipimport: zlib available <<< 19285 1727203903.72974: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 19285 1727203903.73019: stdout chunk (state=3): >>># destroy __main__ <<< 19285 1727203903.73355: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 19285 1727203903.73362: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear 
sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 19285 1727203903.73365: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io <<< 19285 1727203903.73367: stdout chunk (state=3): >>># cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 19285 1727203903.73417: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # 
cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 19285 1727203903.73502: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse <<< 19285 1727203903.73636: stdout chunk (state=3): >>># cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] 
removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 19285 1727203903.73939: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 19285 1727203903.73997: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 19285 1727203903.74078: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # 
cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 19285 1727203903.74315: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib 
# cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 19285 1727203903.74424: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 19285 1727203903.74506: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 19285 1727203903.74616: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 19285 1727203903.74631: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 19285 1727203903.74654: stdout chunk (state=3): >>># destroy itertools <<< 19285 1727203903.74791: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 19285 1727203903.75253: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203903.75256: stdout chunk (state=3): >>><<< 19285 1727203903.75258: stderr chunk (state=3): >>><<< 19285 1727203903.75591: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553699130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355369a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d7f50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f35536ec0e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355370f980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355370ff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536efc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536ed340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d5100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3553733950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553732570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536ee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553730d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553760950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d4380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3553760e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553760cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35537610a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35536d2ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553761760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553761460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553762660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377c860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377dfa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f355377ee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377f4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377e390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355377ff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355377f650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553762690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3553503da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352c8f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352c650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352c920> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352d250> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f355352dc40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352cb00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553501f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352f050> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352dd90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3553762d80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535573e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355357b7a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535d8590> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535dacf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535d86b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35535a15b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f29700> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355357a5a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f355352ffb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f355357a900> # zipimport: found 30 names in '/tmp/ansible_stat_payload_488dqboz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f7b470> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f5e360> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f5d4c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552f79340> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa6de0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6b70> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6480> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fa6ed0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35538ea9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa7b00> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552fa7d40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fcc230> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e11f10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e13b30> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e14500> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e153d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e200e0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e20230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e163c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e23f80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e22a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e227b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e22d20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e168d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e681a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e68350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e69df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e69bb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e6c2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6a4e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6faa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e6c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e705c0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e70e90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e70920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e68470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552efc260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552efd520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e72a20> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552e73da0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552e72630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552d01820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d027b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552efd760> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d02c60> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d039e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3552d0e3c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d0b650> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fd2ba0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552fee870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d0e480> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3552d031d0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
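The `auto-mux` lines above are OpenSSH connection multiplexing at work: Ansible's default `ssh_args` request `ControlMaster=auto` with a 60-second `ControlPersist`, so each task reuses the master socket under `/root/.ansible/cp/` instead of opening a fresh TCP connection, and the remote exit status comes back through the master (`Received exit status from master 0`). A minimal `ssh_config` sketch that reproduces the same reuse outside Ansible (the host and socket path here are illustrative, not taken from this run):

```
Host 10.31.13.254
    ControlMaster auto
    ControlPath ~/.ssh/cp/%r@%h:%p
    ControlPersist 60s
```

With this in place, a second `ssh -vv 10.31.13.254` should show the same `mux_client_hello_exchange` / `mux_client_request_session` lines seen in the log instead of a full key exchange.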
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy 

[interpreter cleanup[3]/destroy teardown trace omitted; byte-for-byte identical to the sequence above, ending with 'clear sys.audit hooks'] 19285 1727203903.77021: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203903.77025: _low_level_execute_command(): starting 19285 1727203903.77028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203903.0643332-19587-242293927181521/ > /dev/null 2>&1 && sleep 0' 19285 1727203903.77356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203903.77377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.77398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203903.77491: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203903.77679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203903.77714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203903.77787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203903.79782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 
1727203903.79803: stdout chunk (state=3): >>><<< 19285 1727203903.79806: stderr chunk (state=3): >>><<< 19285 1727203903.79822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203903.79891: handler run complete 19285 1727203903.79930: attempt loop complete, returning result 19285 1727203903.80165: _execute() done 19285 1727203903.80169: dumping result to json 19285 1727203903.80171: done dumping result, returning 19285 1727203903.80173: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [028d2410-947f-f31b-fb3f-000000000091] 19285 1727203903.80383: sending task result for task 028d2410-947f-f31b-fb3f-000000000091 19285 1727203903.80448: done sending task result for task 028d2410-947f-f31b-fb3f-000000000091 19285 1727203903.80452: WORKER PROCESS EXITING ok: [managed-node2] 
=> { "changed": false, "stat": { "exists": false } } 19285 1727203903.80613: no more pending results, returning what we have 19285 1727203903.80616: results queue empty 19285 1727203903.80618: checking for any_errors_fatal 19285 1727203903.80621: done checking for any_errors_fatal 19285 1727203903.80622: checking for max_fail_percentage 19285 1727203903.80624: done checking for max_fail_percentage 19285 1727203903.80625: checking to see if all hosts have failed and the running result is not ok 19285 1727203903.80625: done checking to see if all hosts have failed 19285 1727203903.80626: getting the remaining hosts for this loop 19285 1727203903.80628: done getting the remaining hosts for this loop 19285 1727203903.80631: getting the next task for host managed-node2 19285 1727203903.80637: done getting next task for host managed-node2 19285 1727203903.80640: ^ task is: TASK: Set flag to indicate system is ostree 19285 1727203903.80642: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203903.80645: getting variables 19285 1727203903.80647: in VariableManager get_vars() 19285 1727203903.80674: Calling all_inventory to load vars for managed-node2 19285 1727203903.80984: Calling groups_inventory to load vars for managed-node2 19285 1727203903.80988: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203903.80998: Calling all_plugins_play to load vars for managed-node2 19285 1727203903.81001: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203903.81004: Calling groups_plugins_play to load vars for managed-node2 19285 1727203903.81162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203903.81653: done with get_vars() 19285 1727203903.81665: done getting variables 19285 1727203903.82197: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:51:43 -0400 (0:00:00.863) 0:00:02.896 ***** 19285 1727203903.82226: entering _queue_task() for managed-node2/set_fact 19285 1727203903.82228: Creating lock for set_fact 19285 1727203903.83439: worker is 1 (out of 1 available) 19285 1727203903.83452: exiting _queue_task() for managed-node2/set_fact 19285 1727203903.83466: done queuing things up, now waiting for results queue to drain 19285 1727203903.83467: waiting for pending results... 
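The "Check if system is ostree" task that completed just above stat'ed `/run/ostree-booted` and returned `exists: false`. Reconstructed from the log (a sketch, not the verbatim source of `el_repo_setup.yml`), the task amounts to:

```yaml
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
```

The `register` name is an assumption, inferred from the `__ostree_booted_stat` variable the next task reads.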
19285 1727203903.84001: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 19285 1727203903.84684: in run() - task 028d2410-947f-f31b-fb3f-000000000092 19285 1727203903.84687: variable 'ansible_search_path' from source: unknown 19285 1727203903.84691: variable 'ansible_search_path' from source: unknown 19285 1727203903.84693: calling self._execute() 19285 1727203903.85088: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.85092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.85095: variable 'omit' from source: magic vars 19285 1727203903.86525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203903.87132: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203903.87329: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203903.87369: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203903.87487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203903.87687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203903.87791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203903.87809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203903.87834: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203903.88189: Evaluated conditional (not __network_is_ostree is defined): True 19285 1727203903.88202: variable 'omit' from source: magic vars 19285 1727203903.88401: variable 'omit' from source: magic vars 19285 1727203903.88565: variable '__ostree_booted_stat' from source: set_fact 19285 1727203903.88682: variable 'omit' from source: magic vars 19285 1727203903.88763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203903.89056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203903.89060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203903.89062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203903.89064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203903.89089: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203903.89099: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.89108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.89343: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203903.89357: Set connection var ansible_pipelining to False 19285 1727203903.89371: Set connection var ansible_timeout to 10 19285 1727203903.89600: Set connection var ansible_shell_type to sh 19285 1727203903.89603: Set connection var ansible_shell_executable to /bin/sh 19285 1727203903.89605: Set connection var ansible_connection to ssh 19285 1727203903.89607: variable 'ansible_shell_executable' 
from source: unknown 19285 1727203903.89610: variable 'ansible_connection' from source: unknown 19285 1727203903.89613: variable 'ansible_module_compression' from source: unknown 19285 1727203903.89615: variable 'ansible_shell_type' from source: unknown 19285 1727203903.89620: variable 'ansible_shell_executable' from source: unknown 19285 1727203903.89623: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.89625: variable 'ansible_pipelining' from source: unknown 19285 1727203903.89626: variable 'ansible_timeout' from source: unknown 19285 1727203903.89628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.89837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203903.90037: variable 'omit' from source: magic vars 19285 1727203903.90044: starting attempt loop 19285 1727203903.90047: running the handler 19285 1727203903.90049: handler run complete 19285 1727203903.90052: attempt loop complete, returning result 19285 1727203903.90054: _execute() done 19285 1727203903.90056: dumping result to json 19285 1727203903.90058: done dumping result, returning 19285 1727203903.90062: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [028d2410-947f-f31b-fb3f-000000000092] 19285 1727203903.90064: sending task result for task 028d2410-947f-f31b-fb3f-000000000092 ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 19285 1727203903.90415: no more pending results, returning what we have 19285 1727203903.90418: results queue empty 19285 1727203903.90419: checking for any_errors_fatal 19285 1727203903.90423: done checking for any_errors_fatal 19285 
1727203903.90424: checking for max_fail_percentage 19285 1727203903.90426: done checking for max_fail_percentage 19285 1727203903.90427: checking to see if all hosts have failed and the running result is not ok 19285 1727203903.90428: done checking to see if all hosts have failed 19285 1727203903.90429: getting the remaining hosts for this loop 19285 1727203903.90431: done getting the remaining hosts for this loop 19285 1727203903.90434: getting the next task for host managed-node2 19285 1727203903.90445: done getting next task for host managed-node2 19285 1727203903.90450: ^ task is: TASK: Fix CentOS6 Base repo 19285 1727203903.90452: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203903.90456: getting variables 19285 1727203903.90462: in VariableManager get_vars() 19285 1727203903.90496: Calling all_inventory to load vars for managed-node2 19285 1727203903.90499: Calling groups_inventory to load vars for managed-node2 19285 1727203903.90503: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203903.90513: Calling all_plugins_play to load vars for managed-node2 19285 1727203903.90516: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203903.90519: Calling groups_plugins_play to load vars for managed-node2 19285 1727203903.91093: done sending task result for task 028d2410-947f-f31b-fb3f-000000000092 19285 1727203903.91104: WORKER PROCESS EXITING 19285 1727203903.91128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203903.91574: done with get_vars() 19285 1727203903.91645: done getting variables 19285 1727203903.91890: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:51:43 -0400 (0:00:00.096) 0:00:02.993 ***** 19285 1727203903.91918: entering _queue_task() for managed-node2/copy 19285 1727203903.92649: worker is 1 (out of 1 available) 19285 1727203903.92659: exiting _queue_task() for managed-node2/copy 19285 1727203903.92670: done queuing things up, now waiting for results queue to drain 19285 1727203903.92671: waiting for pending results... 
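The set_fact task traced above turns that stat result into the `__network_is_ostree` flag: the log shows the guard conditional `(not __network_is_ostree is defined)` evaluating True and the fact landing as `false`, matching `exists: false` from the stat. A sketch consistent with those lines (the exact Jinja expression is an assumption, not quoted from the task file):

```yaml
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```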
19285 1727203903.92998: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 19285 1727203903.93381: in run() - task 028d2410-947f-f31b-fb3f-000000000094 19285 1727203903.93385: variable 'ansible_search_path' from source: unknown 19285 1727203903.93387: variable 'ansible_search_path' from source: unknown 19285 1727203903.93390: calling self._execute() 19285 1727203903.93581: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.93586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.93588: variable 'omit' from source: magic vars 19285 1727203903.94579: variable 'ansible_distribution' from source: facts 19285 1727203903.94583: Evaluated conditional (ansible_distribution == 'CentOS'): True 19285 1727203903.94717: variable 'ansible_distribution_major_version' from source: facts 19285 1727203903.94802: Evaluated conditional (ansible_distribution_major_version == '6'): False 19285 1727203903.94809: when evaluation is False, skipping this task 19285 1727203903.94815: _execute() done 19285 1727203903.94821: dumping result to json 19285 1727203903.94826: done dumping result, returning 19285 1727203903.94834: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [028d2410-947f-f31b-fb3f-000000000094] 19285 1727203903.94842: sending task result for task 028d2410-947f-f31b-fb3f-000000000094 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 19285 1727203903.95052: no more pending results, returning what we have 19285 1727203903.95056: results queue empty 19285 1727203903.95057: checking for any_errors_fatal 19285 1727203903.95063: done checking for any_errors_fatal 19285 1727203903.95063: checking for max_fail_percentage 19285 1727203903.95067: done checking for max_fail_percentage 19285 1727203903.95068: checking to see if all hosts have failed and the 
running result is not ok 19285 1727203903.95069: done checking to see if all hosts have failed 19285 1727203903.95070: getting the remaining hosts for this loop 19285 1727203903.95071: done getting the remaining hosts for this loop 19285 1727203903.95074: getting the next task for host managed-node2 19285 1727203903.95088: done getting next task for host managed-node2 19285 1727203903.95091: ^ task is: TASK: Include the task 'enable_epel.yml' 19285 1727203903.95094: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203903.95098: getting variables
19285 1727203903.95102: in VariableManager get_vars()
19285 1727203903.95306: Calling all_inventory to load vars for managed-node2
19285 1727203903.95309: Calling groups_inventory to load vars for managed-node2
19285 1727203903.95314: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203903.95327: Calling all_plugins_play to load vars for managed-node2
19285 1727203903.95332: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203903.95336: Calling groups_plugins_play to load vars for managed-node2
19285 1727203903.95833: done sending task result for task 028d2410-947f-f31b-fb3f-000000000094
19285 1727203903.95836: WORKER PROCESS EXITING
19285 1727203903.95864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203903.96224: done with get_vars()
19285 1727203903.96234: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Tuesday 24 September 2024 14:51:43 -0400 (0:00:00.045) 0:00:03.038 *****
19285 1727203903.96449: entering _queue_task() for managed-node2/include_tasks
19285 1727203903.97191: worker is 1 (out of 1 available)
19285 1727203903.97204: exiting _queue_task() for managed-node2/include_tasks
19285 1727203903.97215: done queuing things up, now waiting for results queue to drain
19285 1727203903.97454: waiting for pending results...
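The skip recorded above for 'Fix CentOS6 Base repo' follows directly from the two conditionals the log evaluates: `ansible_distribution == 'CentOS'` is True and `ansible_distribution_major_version == '6'` is False, so the `copy` action never runs. As a hedged sketch only, a task producing this trace would look roughly like the following; the `dest` path and repo contents are illustrative assumptions, since the log shows only the action plugin and the conditionals.

```yaml
# Hypothetical reconstruction of the task at el_repo_setup.yml:26.
# Only the name, the copy action, and the when conditions are attested
# by the log; dest and content are assumptions for illustration.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination
    content: |
      [base]
      name=CentOS-$releasever - Base
      baseurl=https://vault.centos.org/6.10/os/$basearch/  # assumed mirror
      gpgcheck=0
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this CentOS 10 node the second condition is False, which matches the `false_condition` reported in the skip result.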
19285 1727203903.97799: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 19285 1727203903.98030: in run() - task 028d2410-947f-f31b-fb3f-000000000095 19285 1727203903.98061: variable 'ansible_search_path' from source: unknown 19285 1727203903.98097: variable 'ansible_search_path' from source: unknown 19285 1727203903.98181: calling self._execute() 19285 1727203903.98341: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203903.98370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203903.98434: variable 'omit' from source: magic vars 19285 1727203903.99894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203904.05240: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203904.05465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203904.05558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203904.05632: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203904.05666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203904.05755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203904.05793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203904.05835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203904.05878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203904.05945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203904.06028: variable '__network_is_ostree' from source: set_fact 19285 1727203904.06055: Evaluated conditional (not __network_is_ostree | d(false)): True 19285 1727203904.06070: _execute() done 19285 1727203904.06084: dumping result to json 19285 1727203904.06092: done dumping result, returning 19285 1727203904.06102: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [028d2410-947f-f31b-fb3f-000000000095] 19285 1727203904.06112: sending task result for task 028d2410-947f-f31b-fb3f-000000000095 19285 1727203904.06406: no more pending results, returning what we have 19285 1727203904.06412: in VariableManager get_vars() 19285 1727203904.06449: Calling all_inventory to load vars for managed-node2 19285 1727203904.06452: Calling groups_inventory to load vars for managed-node2 19285 1727203904.06457: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.06469: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.06472: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.06478: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.07056: done sending task result for task 028d2410-947f-f31b-fb3f-000000000095 19285 1727203904.07059: WORKER PROCESS EXITING 19285 1727203904.07085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 19285 1727203904.07499: done with get_vars() 19285 1727203904.07506: variable 'ansible_search_path' from source: unknown 19285 1727203904.07507: variable 'ansible_search_path' from source: unknown 19285 1727203904.07656: we have included files to process 19285 1727203904.07657: generating all_blocks data 19285 1727203904.07659: done generating all_blocks data 19285 1727203904.07663: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19285 1727203904.07664: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19285 1727203904.07666: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19285 1727203904.09241: done processing included file 19285 1727203904.09243: iterating over new_blocks loaded from include file 19285 1727203904.09244: in VariableManager get_vars() 19285 1727203904.09257: done with get_vars() 19285 1727203904.09261: filtering new block on tags 19285 1727203904.09399: done filtering new block on tags 19285 1727203904.09402: in VariableManager get_vars() 19285 1727203904.09415: done with get_vars() 19285 1727203904.09417: filtering new block on tags 19285 1727203904.09428: done filtering new block on tags 19285 1727203904.09430: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 19285 1727203904.09436: extending task lists for all hosts with included blocks 19285 1727203904.09639: done extending task lists 19285 1727203904.09641: done processing included files 19285 1727203904.09642: results queue empty 19285 1727203904.09642: checking for any_errors_fatal 19285 1727203904.09645: done checking for any_errors_fatal 19285 1727203904.09646: checking for max_fail_percentage 19285 1727203904.09647: done 
checking for max_fail_percentage 19285 1727203904.09648: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.09649: done checking to see if all hosts have failed 19285 1727203904.09650: getting the remaining hosts for this loop 19285 1727203904.09651: done getting the remaining hosts for this loop 19285 1727203904.09653: getting the next task for host managed-node2 19285 1727203904.09657: done getting next task for host managed-node2 19285 1727203904.09659: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 19285 1727203904.09662: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203904.09664: getting variables
19285 1727203904.09665: in VariableManager get_vars()
19285 1727203904.09677: Calling all_inventory to load vars for managed-node2
19285 1727203904.09679: Calling groups_inventory to load vars for managed-node2
19285 1727203904.09682: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203904.09689: Calling all_plugins_play to load vars for managed-node2
19285 1727203904.09698: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203904.09703: Calling groups_plugins_play to load vars for managed-node2
19285 1727203904.10149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203904.10344: done with get_vars()
19285 1727203904.10352: done getting variables
19285 1727203904.10425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
19285 1727203904.10621: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.142) 0:00:03.181 *****
19285 1727203904.10666: entering _queue_task() for managed-node2/command
19285 1727203904.10668: Creating lock for command
19285 1727203904.11072: worker is 1 (out of 1 available)
19285 1727203904.11240: exiting _queue_task() for managed-node2/command
19285 1727203904.11251: done queuing things up, now waiting for results queue to drain
19285 1727203904.11252: waiting for pending results...
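The include above is gated on the ostree check: the log shows `Evaluated conditional (not __network_is_ostree | d(false)): True` before the include result is returned. A minimal sketch of an include task consistent with that trace (the exact layout around el_repo_setup.yml:51 is an assumption):

```yaml
# Sketch only: the task name, include target, and conditional are taken
# from the log; everything else is assumed.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because the condition holds, the blocks from enable_epel.yml are loaded, filtered on tags, and appended to the task list for managed-node2, which is why the EPEL tasks below appear at all.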
19285 1727203904.11366: running TaskExecutor() for managed-node2/TASK: Create EPEL 10
19285 1727203904.11541: in run() - task 028d2410-947f-f31b-fb3f-0000000000af
19285 1727203904.11564: variable 'ansible_search_path' from source: unknown
19285 1727203904.11570: variable 'ansible_search_path' from source: unknown
19285 1727203904.11612: calling self._execute()
19285 1727203904.11689: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203904.11708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203904.11736: variable 'omit' from source: magic vars
19285 1727203904.12190: variable 'ansible_distribution' from source: facts
19285 1727203904.12274: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19285 1727203904.12392: variable 'ansible_distribution_major_version' from source: facts
19285 1727203904.12404: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19285 1727203904.12431: when evaluation is False, skipping this task
19285 1727203904.12434: _execute() done
19285 1727203904.12436: dumping result to json
19285 1727203904.12439: done dumping result, returning
19285 1727203904.12441: done running TaskExecutor() for managed-node2/TASK: Create EPEL 10 [028d2410-947f-f31b-fb3f-0000000000af]
19285 1727203904.12541: sending task result for task 028d2410-947f-f31b-fb3f-0000000000af
19285 1727203904.12618: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000af
19285 1727203904.12622: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19285 1727203904.12678: no more pending results, returning what we have
19285 1727203904.12682: results queue empty
19285 1727203904.12684: checking for any_errors_fatal
19285 1727203904.12685: done checking for any_errors_fatal
19285 1727203904.12686: checking for
max_fail_percentage 19285 1727203904.12687: done checking for max_fail_percentage 19285 1727203904.12688: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.12689: done checking to see if all hosts have failed 19285 1727203904.12690: getting the remaining hosts for this loop 19285 1727203904.12691: done getting the remaining hosts for this loop 19285 1727203904.12695: getting the next task for host managed-node2 19285 1727203904.12702: done getting next task for host managed-node2 19285 1727203904.12705: ^ task is: TASK: Install yum-utils package 19285 1727203904.12709: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203904.12713: getting variables
19285 1727203904.12714: in VariableManager get_vars()
19285 1727203904.12742: Calling all_inventory to load vars for managed-node2
19285 1727203904.12745: Calling groups_inventory to load vars for managed-node2
19285 1727203904.12749: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203904.12877: Calling all_plugins_play to load vars for managed-node2
19285 1727203904.12881: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203904.12884: Calling groups_plugins_play to load vars for managed-node2
19285 1727203904.13364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203904.13647: done with get_vars()
19285 1727203904.13655: done getting variables
19285 1727203904.14082: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.034) 0:00:03.215 *****
19285 1727203904.14127: entering _queue_task() for managed-node2/package
19285 1727203904.14129: Creating lock for package
19285 1727203904.14875: worker is 1 (out of 1 available)
19285 1727203904.14887: exiting _queue_task() for managed-node2/package
19285 1727203904.14897: done queuing things up, now waiting for results queue to drain
19285 1727203904.14898: waiting for pending results...
19285 1727203904.15445: running TaskExecutor() for managed-node2/TASK: Install yum-utils package
19285 1727203904.15586: in run() - task 028d2410-947f-f31b-fb3f-0000000000b0
19285 1727203904.15851: variable 'ansible_search_path' from source: unknown
19285 1727203904.15855: variable 'ansible_search_path' from source: unknown
19285 1727203904.15859: calling self._execute()
19285 1727203904.15962: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203904.15981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203904.15997: variable 'omit' from source: magic vars
19285 1727203904.16904: variable 'ansible_distribution' from source: facts
19285 1727203904.16974: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19285 1727203904.17247: variable 'ansible_distribution_major_version' from source: facts
19285 1727203904.17289: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19285 1727203904.17297: when evaluation is False, skipping this task
19285 1727203904.17305: _execute() done
19285 1727203904.17311: dumping result to json
19285 1727203904.17332: done dumping result, returning
19285 1727203904.17343: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [028d2410-947f-f31b-fb3f-0000000000b0]
19285 1727203904.17484: sending task result for task 028d2410-947f-f31b-fb3f-0000000000b0
19285 1727203904.17681: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000b0
19285 1727203904.17684: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19285 1727203904.17738: no more pending results, returning what we have
19285 1727203904.17742: results queue empty
19285 1727203904.17743: checking for any_errors_fatal
19285 1727203904.17753: done checking for any_errors_fatal
19285
1727203904.17754: checking for max_fail_percentage 19285 1727203904.17755: done checking for max_fail_percentage 19285 1727203904.17756: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.17757: done checking to see if all hosts have failed 19285 1727203904.17759: getting the remaining hosts for this loop 19285 1727203904.17761: done getting the remaining hosts for this loop 19285 1727203904.17768: getting the next task for host managed-node2 19285 1727203904.17793: done getting next task for host managed-node2 19285 1727203904.17796: ^ task is: TASK: Enable EPEL 7 19285 1727203904.17801: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203904.17804: getting variables
19285 1727203904.17806: in VariableManager get_vars()
19285 1727203904.17836: Calling all_inventory to load vars for managed-node2
19285 1727203904.17839: Calling groups_inventory to load vars for managed-node2
19285 1727203904.17995: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203904.18010: Calling all_plugins_play to load vars for managed-node2
19285 1727203904.18013: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203904.18016: Calling groups_plugins_play to load vars for managed-node2
19285 1727203904.18720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203904.19062: done with get_vars()
19285 1727203904.19077: done getting variables
19285 1727203904.19140: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.050) 0:00:03.266 *****
19285 1727203904.19174: entering _queue_task() for managed-node2/command
19285 1727203904.19988: worker is 1 (out of 1 available)
19285 1727203904.20000: exiting _queue_task() for managed-node2/command
19285 1727203904.20013: done queuing things up, now waiting for results queue to drain
19285 1727203904.20014: waiting for pending results...
19285 1727203904.20473: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7
19285 1727203904.21010: in run() - task 028d2410-947f-f31b-fb3f-0000000000b1
19285 1727203904.21013: variable 'ansible_search_path' from source: unknown
19285 1727203904.21016: variable 'ansible_search_path' from source: unknown
19285 1727203904.21249: calling self._execute()
19285 1727203904.21444: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203904.21448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203904.21459: variable 'omit' from source: magic vars
19285 1727203904.22477: variable 'ansible_distribution' from source: facts
19285 1727203904.22489: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19285 1727203904.22925: variable 'ansible_distribution_major_version' from source: facts
19285 1727203904.22929: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19285 1727203904.22932: when evaluation is False, skipping this task
19285 1727203904.22935: _execute() done
19285 1727203904.22937: dumping result to json
19285 1727203904.22940: done dumping result, returning
19285 1727203904.22948: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [028d2410-947f-f31b-fb3f-0000000000b1]
19285 1727203904.23091: sending task result for task 028d2410-947f-f31b-fb3f-0000000000b1
19285 1727203904.23385: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000b1
19285 1727203904.23389: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19285 1727203904.23440: no more pending results, returning what we have
19285 1727203904.23443: results queue empty
19285 1727203904.23445: checking for any_errors_fatal
19285 1727203904.23452: done checking for any_errors_fatal
19285 1727203904.23453: checking for
max_fail_percentage 19285 1727203904.23455: done checking for max_fail_percentage 19285 1727203904.23456: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.23457: done checking to see if all hosts have failed 19285 1727203904.23458: getting the remaining hosts for this loop 19285 1727203904.23459: done getting the remaining hosts for this loop 19285 1727203904.23463: getting the next task for host managed-node2 19285 1727203904.23471: done getting next task for host managed-node2 19285 1727203904.23473: ^ task is: TASK: Enable EPEL 8 19285 1727203904.23479: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203904.23483: getting variables
19285 1727203904.23485: in VariableManager get_vars()
19285 1727203904.23518: Calling all_inventory to load vars for managed-node2
19285 1727203904.23521: Calling groups_inventory to load vars for managed-node2
19285 1727203904.23525: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203904.23537: Calling all_plugins_play to load vars for managed-node2
19285 1727203904.23540: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203904.23542: Calling groups_plugins_play to load vars for managed-node2
19285 1727203904.23959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203904.24325: done with get_vars()
19285 1727203904.24334: done getting variables
19285 1727203904.24515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.053) 0:00:03.319 *****
19285 1727203904.24546: entering _queue_task() for managed-node2/command
19285 1727203904.25250: worker is 1 (out of 1 available)
19285 1727203904.25259: exiting _queue_task() for managed-node2/command
19285 1727203904.25269: done queuing things up, now waiting for results queue to drain
19285 1727203904.25271: waiting for pending results...
19285 1727203904.25594: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8
19285 1727203904.25725: in run() - task 028d2410-947f-f31b-fb3f-0000000000b2
19285 1727203904.25784: variable 'ansible_search_path' from source: unknown
19285 1727203904.25802: variable 'ansible_search_path' from source: unknown
19285 1727203904.25879: calling self._execute()
19285 1727203904.26091: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203904.26099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203904.26113: variable 'omit' from source: magic vars
19285 1727203904.26895: variable 'ansible_distribution' from source: facts
19285 1727203904.26981: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19285 1727203904.27243: variable 'ansible_distribution_major_version' from source: facts
19285 1727203904.27385: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19285 1727203904.27388: when evaluation is False, skipping this task
19285 1727203904.27391: _execute() done
19285 1727203904.27393: dumping result to json
19285 1727203904.27395: done dumping result, returning
19285 1727203904.27398: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [028d2410-947f-f31b-fb3f-0000000000b2]
19285 1727203904.27400: sending task result for task 028d2410-947f-f31b-fb3f-0000000000b2
19285 1727203904.27619: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000b2
19285 1727203904.27623: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19285 1727203904.27671: no more pending results, returning what we have
19285 1727203904.27674: results queue empty
19285 1727203904.27677: checking for any_errors_fatal
19285 1727203904.27682: done checking for any_errors_fatal
19285 1727203904.27683: checking for
max_fail_percentage 19285 1727203904.27685: done checking for max_fail_percentage 19285 1727203904.27686: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.27686: done checking to see if all hosts have failed 19285 1727203904.27687: getting the remaining hosts for this loop 19285 1727203904.27689: done getting the remaining hosts for this loop 19285 1727203904.27692: getting the next task for host managed-node2 19285 1727203904.27702: done getting next task for host managed-node2 19285 1727203904.27704: ^ task is: TASK: Enable EPEL 6 19285 1727203904.27708: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
19285 1727203904.27712: getting variables
19285 1727203904.27714: in VariableManager get_vars()
19285 1727203904.27964: Calling all_inventory to load vars for managed-node2
19285 1727203904.27967: Calling groups_inventory to load vars for managed-node2
19285 1727203904.27970: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203904.28058: Calling all_plugins_play to load vars for managed-node2
19285 1727203904.28062: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203904.28066: Calling groups_plugins_play to load vars for managed-node2
19285 1727203904.28356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203904.28768: done with get_vars()
19285 1727203904.28843: done getting variables
19285 1727203904.28903: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.043) 0:00:03.363 *****
19285 1727203904.28934: entering _queue_task() for managed-node2/copy
19285 1727203904.29699: worker is 1 (out of 1 available)
19285 1727203904.29708: exiting _queue_task() for managed-node2/copy
19285 1727203904.29718: done queuing things up, now waiting for results queue to drain
19285 1727203904.29720: waiting for pending results...
19285 1727203904.30066: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 19285 1727203904.30221: in run() - task 028d2410-947f-f31b-fb3f-0000000000b4 19285 1727203904.30240: variable 'ansible_search_path' from source: unknown 19285 1727203904.30247: variable 'ansible_search_path' from source: unknown 19285 1727203904.30356: calling self._execute() 19285 1727203904.30581: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.30586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.30588: variable 'omit' from source: magic vars 19285 1727203904.31185: variable 'ansible_distribution' from source: facts 19285 1727203904.31206: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19285 1727203904.31325: variable 'ansible_distribution_major_version' from source: facts 19285 1727203904.31337: Evaluated conditional (ansible_distribution_major_version == '6'): False 19285 1727203904.31345: when evaluation is False, skipping this task 19285 1727203904.31414: _execute() done 19285 1727203904.31418: dumping result to json 19285 1727203904.31420: done dumping result, returning 19285 1727203904.31423: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [028d2410-947f-f31b-fb3f-0000000000b4] 19285 1727203904.31426: sending task result for task 028d2410-947f-f31b-fb3f-0000000000b4 19285 1727203904.31506: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000b4 19285 1727203904.31509: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 19285 1727203904.31560: no more pending results, returning what we have 19285 1727203904.31564: results queue empty 19285 1727203904.31565: checking for any_errors_fatal 19285 1727203904.31570: done checking for any_errors_fatal 19285 1727203904.31571: checking for max_fail_percentage 
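The skip recorded above is the normal outcome of a multi-clause `when:` condition: Ansible evaluates each clause in order and skips the task at the first one that is false, reporting it as `false_condition`. The log shows `ansible_distribution in ['RedHat', 'CentOS']` evaluating True and `ansible_distribution_major_version == '6'` evaluating False. A minimal sketch of that two-stage check in plain Python (the fact values are assumptions; the log prints only the evaluation results, not the facts themselves):

```python
# Sketch of ordered `when:` evaluation: stop at the first False clause
# and report it, the way the skip result above reports "false_condition".
def evaluate_when(facts, conditions):
    """Return (run_task, failing_condition_text)."""
    for cond in conditions:
        if not cond(facts):
            return False, cond.__doc__
    return True, None

# Assumed fact values; the log confirms only the evaluation outcomes.
facts = {"ansible_distribution": "CentOS",
         "ansible_distribution_major_version": "9"}

def is_rhel_family(f):
    """ansible_distribution in ['RedHat', 'CentOS']"""
    return f["ansible_distribution"] in ["RedHat", "CentOS"]   # True in the log

def is_major_6(f):
    """ansible_distribution_major_version == '6'"""
    return f["ansible_distribution_major_version"] == "6"      # False in the log

run, false_condition = evaluate_when(facts, [is_rhel_family, is_major_6])
```

Because `run` is False, the executor short-circuits before any module runs, which is why the skip result carries `"changed": false` and no module output.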
19285 1727203904.31573: done checking for max_fail_percentage 19285 1727203904.31574: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.31574: done checking to see if all hosts have failed 19285 1727203904.31576: getting the remaining hosts for this loop 19285 1727203904.31578: done getting the remaining hosts for this loop 19285 1727203904.31581: getting the next task for host managed-node2 19285 1727203904.31591: done getting next task for host managed-node2 19285 1727203904.31595: ^ task is: TASK: Set network provider to 'nm' 19285 1727203904.31597: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203904.31601: getting variables 19285 1727203904.31603: in VariableManager get_vars() 19285 1727203904.31634: Calling all_inventory to load vars for managed-node2 19285 1727203904.31637: Calling groups_inventory to load vars for managed-node2 19285 1727203904.31641: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.31654: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.31657: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.31660: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.32105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.32295: done with get_vars() 19285 1727203904.32304: done getting variables 19285 1727203904.32363: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.034) 0:00:03.398 ***** 19285 1727203904.32391: entering _queue_task() for managed-node2/set_fact 19285 1727203904.32623: worker is 1 (out of 1 available) 19285 1727203904.32634: exiting _queue_task() for managed-node2/set_fact 19285 1727203904.32646: done queuing things up, now waiting for results queue to drain 19285 1727203904.32647: waiting for pending results... 19285 1727203904.32994: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 19285 1727203904.32999: in run() - task 028d2410-947f-f31b-fb3f-000000000007 19285 1727203904.33002: variable 'ansible_search_path' from source: unknown 19285 1727203904.33029: calling self._execute() 19285 1727203904.33113: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.33126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.33140: variable 'omit' from source: magic vars 19285 1727203904.33249: variable 'omit' from source: magic vars 19285 1727203904.33287: variable 'omit' from source: magic vars 19285 1727203904.33417: variable 'omit' from source: magic vars 19285 1727203904.33420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203904.33424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203904.33454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203904.33480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203904.33537: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203904.33605: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203904.33631: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.33641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.33799: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203904.33811: Set connection var ansible_pipelining to False 19285 1727203904.33820: Set connection var ansible_timeout to 10 19285 1727203904.33827: Set connection var ansible_shell_type to sh 19285 1727203904.33863: Set connection var ansible_shell_executable to /bin/sh 19285 1727203904.33881: Set connection var ansible_connection to ssh 19285 1727203904.33961: variable 'ansible_shell_executable' from source: unknown 19285 1727203904.33965: variable 'ansible_connection' from source: unknown 19285 1727203904.33971: variable 'ansible_module_compression' from source: unknown 19285 1727203904.33974: variable 'ansible_shell_type' from source: unknown 19285 1727203904.33978: variable 'ansible_shell_executable' from source: unknown 19285 1727203904.33981: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.33983: variable 'ansible_pipelining' from source: unknown 19285 1727203904.33986: variable 'ansible_timeout' from source: unknown 19285 1727203904.33988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.34179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203904.34183: variable 'omit' from source: magic vars 19285 1727203904.34186: starting 
attempt loop 19285 1727203904.34188: running the handler 19285 1727203904.34191: handler run complete 19285 1727203904.34193: attempt loop complete, returning result 19285 1727203904.34195: _execute() done 19285 1727203904.34197: dumping result to json 19285 1727203904.34199: done dumping result, returning 19285 1727203904.34201: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [028d2410-947f-f31b-fb3f-000000000007] 19285 1727203904.34205: sending task result for task 028d2410-947f-f31b-fb3f-000000000007 19285 1727203904.34538: done sending task result for task 028d2410-947f-f31b-fb3f-000000000007 19285 1727203904.34541: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 19285 1727203904.34602: no more pending results, returning what we have 19285 1727203904.34605: results queue empty 19285 1727203904.34606: checking for any_errors_fatal 19285 1727203904.34613: done checking for any_errors_fatal 19285 1727203904.34614: checking for max_fail_percentage 19285 1727203904.34615: done checking for max_fail_percentage 19285 1727203904.34616: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.34617: done checking to see if all hosts have failed 19285 1727203904.34618: getting the remaining hosts for this loop 19285 1727203904.34619: done getting the remaining hosts for this loop 19285 1727203904.34623: getting the next task for host managed-node2 19285 1727203904.34630: done getting next task for host managed-node2 19285 1727203904.34633: ^ task is: TASK: meta (flush_handlers) 19285 1727203904.34634: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203904.34638: getting variables 19285 1727203904.34640: in VariableManager get_vars() 19285 1727203904.34784: Calling all_inventory to load vars for managed-node2 19285 1727203904.34787: Calling groups_inventory to load vars for managed-node2 19285 1727203904.34791: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.34799: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.34802: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.34805: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.35184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.35676: done with get_vars() 19285 1727203904.35686: done getting variables 19285 1727203904.35748: in VariableManager get_vars() 19285 1727203904.35757: Calling all_inventory to load vars for managed-node2 19285 1727203904.35762: Calling groups_inventory to load vars for managed-node2 19285 1727203904.35764: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.35784: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.35788: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.35792: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.36149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.36302: done with get_vars() 19285 1727203904.36318: done queuing things up, now waiting for results queue to drain 19285 1727203904.36320: results queue empty 19285 1727203904.36320: checking for any_errors_fatal 19285 1727203904.36322: done checking for any_errors_fatal 19285 1727203904.36323: checking for max_fail_percentage 19285 1727203904.36324: done checking for max_fail_percentage 19285 1727203904.36324: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203904.36325: done checking to see if all hosts have failed 19285 1727203904.36326: getting the remaining hosts for this loop 19285 1727203904.36327: done getting the remaining hosts for this loop 19285 1727203904.36329: getting the next task for host managed-node2 19285 1727203904.36332: done getting next task for host managed-node2 19285 1727203904.36333: ^ task is: TASK: meta (flush_handlers) 19285 1727203904.36334: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203904.36340: getting variables 19285 1727203904.36341: in VariableManager get_vars() 19285 1727203904.36348: Calling all_inventory to load vars for managed-node2 19285 1727203904.36349: Calling groups_inventory to load vars for managed-node2 19285 1727203904.36351: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.36355: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.36357: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.36361: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.36551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.36740: done with get_vars() 19285 1727203904.36753: done getting variables 19285 1727203904.36805: in VariableManager get_vars() 19285 1727203904.36813: Calling all_inventory to load vars for managed-node2 19285 1727203904.36815: Calling groups_inventory to load vars for managed-node2 19285 1727203904.36818: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.36822: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.36824: Calling groups_plugins_inventory to load vars for 
managed-node2 19285 1727203904.36827: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.36990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.37168: done with get_vars() 19285 1727203904.37185: done queuing things up, now waiting for results queue to drain 19285 1727203904.37187: results queue empty 19285 1727203904.37188: checking for any_errors_fatal 19285 1727203904.37189: done checking for any_errors_fatal 19285 1727203904.37190: checking for max_fail_percentage 19285 1727203904.37191: done checking for max_fail_percentage 19285 1727203904.37191: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.37192: done checking to see if all hosts have failed 19285 1727203904.37193: getting the remaining hosts for this loop 19285 1727203904.37194: done getting the remaining hosts for this loop 19285 1727203904.37196: getting the next task for host managed-node2 19285 1727203904.37199: done getting next task for host managed-node2 19285 1727203904.37200: ^ task is: None 19285 1727203904.37201: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203904.37203: done queuing things up, now waiting for results queue to drain 19285 1727203904.37203: results queue empty 19285 1727203904.37204: checking for any_errors_fatal 19285 1727203904.37205: done checking for any_errors_fatal 19285 1727203904.37205: checking for max_fail_percentage 19285 1727203904.37206: done checking for max_fail_percentage 19285 1727203904.37207: checking to see if all hosts have failed and the running result is not ok 19285 1727203904.37207: done checking to see if all hosts have failed 19285 1727203904.37209: getting the next task for host managed-node2 19285 1727203904.37211: done getting next task for host managed-node2 19285 1727203904.37212: ^ task is: None 19285 1727203904.37213: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203904.37254: in VariableManager get_vars() 19285 1727203904.37273: done with get_vars() 19285 1727203904.37282: in VariableManager get_vars() 19285 1727203904.37305: done with get_vars() 19285 1727203904.37310: variable 'omit' from source: magic vars 19285 1727203904.37341: in VariableManager get_vars() 19285 1727203904.37351: done with get_vars() 19285 1727203904.37377: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 19285 1727203904.37593: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203904.37624: getting the remaining hosts for this loop 19285 1727203904.37625: done getting the remaining hosts for this loop 19285 1727203904.37628: getting the next task for host managed-node2 19285 1727203904.37631: done getting next task for host managed-node2 19285 1727203904.37633: ^ task is: TASK: Gathering Facts 19285 1727203904.37634: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203904.37636: getting variables 19285 1727203904.37637: in VariableManager get_vars() 19285 1727203904.37644: Calling all_inventory to load vars for managed-node2 19285 1727203904.37646: Calling groups_inventory to load vars for managed-node2 19285 1727203904.37648: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203904.37653: Calling all_plugins_play to load vars for managed-node2 19285 1727203904.37735: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203904.37740: Calling groups_plugins_play to load vars for managed-node2 19285 1727203904.37874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203904.38173: done with get_vars() 19285 1727203904.38183: done getting variables 19285 1727203904.38246: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.058) 0:00:03.457 ***** 19285 1727203904.38273: entering _queue_task() for managed-node2/gather_facts 19285 1727203904.38703: worker is 1 (out of 1 available) 19285 1727203904.38713: exiting _queue_task() for managed-node2/gather_facts 19285 1727203904.38721: done queuing things up, now waiting for results queue to drain 19285 1727203904.38723: waiting for pending results... 
19285 1727203904.38829: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203904.38948: in run() - task 028d2410-947f-f31b-fb3f-0000000000da 19285 1727203904.38952: variable 'ansible_search_path' from source: unknown 19285 1727203904.38989: calling self._execute() 19285 1727203904.39073: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.39085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.39099: variable 'omit' from source: magic vars 19285 1727203904.39481: variable 'ansible_distribution_major_version' from source: facts 19285 1727203904.39504: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203904.39514: variable 'omit' from source: magic vars 19285 1727203904.39544: variable 'omit' from source: magic vars 19285 1727203904.39586: variable 'omit' from source: magic vars 19285 1727203904.39714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203904.39718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203904.39720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203904.39734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203904.39756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203904.39816: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203904.39819: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.39821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.39928: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203904.39944: Set connection var ansible_pipelining to False 19285 1727203904.39979: Set connection var ansible_timeout to 10 19285 1727203904.39982: Set connection var ansible_shell_type to sh 19285 1727203904.39984: Set connection var ansible_shell_executable to /bin/sh 19285 1727203904.39986: Set connection var ansible_connection to ssh 19285 1727203904.40001: variable 'ansible_shell_executable' from source: unknown 19285 1727203904.40008: variable 'ansible_connection' from source: unknown 19285 1727203904.40014: variable 'ansible_module_compression' from source: unknown 19285 1727203904.40021: variable 'ansible_shell_type' from source: unknown 19285 1727203904.40051: variable 'ansible_shell_executable' from source: unknown 19285 1727203904.40054: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203904.40056: variable 'ansible_pipelining' from source: unknown 19285 1727203904.40061: variable 'ansible_timeout' from source: unknown 19285 1727203904.40063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203904.40241: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203904.40271: variable 'omit' from source: magic vars 19285 1727203904.40280: starting attempt loop 19285 1727203904.40286: running the handler 19285 1727203904.40357: variable 'ansible_facts' from source: unknown 19285 1727203904.40363: _low_level_execute_command(): starting 19285 1727203904.40365: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203904.41129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203904.41192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203904.41220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203904.41262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203904.41339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203904.43782: stdout chunk (state=3): >>>/root <<< 19285 1727203904.43786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203904.43789: stdout chunk (state=3): >>><<< 19285 1727203904.43791: stderr chunk (state=3): >>><<< 19285 1727203904.43816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203904.44070: _low_level_execute_command(): starting 19285 1727203904.44074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907 `" && echo ansible-tmp-1727203904.439841-19650-207453097347907="` echo /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907 `" ) && sleep 0' 19285 1727203904.45101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203904.45114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203904.45173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203904.45392: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203904.45485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203904.48304: stdout chunk (state=3): >>>ansible-tmp-1727203904.439841-19650-207453097347907=/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907 <<< 19285 1727203904.48378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203904.48429: stderr chunk (state=3): >>><<< 19285 1727203904.48592: stdout chunk (state=3): >>><<< 19285 1727203904.48640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203904.439841-19650-207453097347907=/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203904.48678: variable 'ansible_module_compression' from source: unknown 19285 1727203904.48737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203904.48805: variable 'ansible_facts' from source: unknown 19285 1727203904.49013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py 19285 1727203904.49337: Sending initial data 19285 1727203904.49340: Sent initial data (153 bytes) 19285 1727203904.49919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 
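The commands traced above are Ansible's standard remote bootstrap: `echo ~` discovers the remote home directory, `mkdir` creates a uniquely named temp directory under `~/.ansible/tmp`, sftp uploads `AnsiballZ_setup.py` into it, and `chmod u+x` makes it executable before it is run. The directory name in the log, `ansible-tmp-1727203904.439841-19650-207453097347907`, follows an `ansible-tmp-<epoch>-<pid>-<random>` scheme; the sketch below mirrors that naming (the exact random-number range is an assumption):

```python
# Sketch of the unique remote temp-directory name seen in the log:
#   ansible-tmp-<epoch timestamp>-<controller pid>-<random>
# Uniqueness comes from combining wall-clock time, the process id,
# and a large random suffix.
import os
import random
import time

def ansible_tmp_name(now=None, pid=None):
    now = time.time() if now is None else now
    pid = os.getpid() if pid is None else pid
    rnd = random.randint(0, 2 ** 48)  # assumed range, for illustration
    return "ansible-tmp-%s-%s-%s" % (now, pid, rnd)

# Reproduce the timestamp/pid from the log above; only the random
# suffix differs from run to run.
name = ansible_tmp_name(now=1727203904.439841, pid=19650)
remote_path = os.path.join("/root/.ansible/tmp", name)
```

Everything is run through `/bin/sh -c '... && sleep 0'`; the trailing `sleep 0` is a deliberate no-op that gives the shell a clean final command.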
19285 1727203904.49931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203904.50040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203904.52279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203904.52347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203904.52435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpde6_l11_ /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py <<< 19285 1727203904.52439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py" <<< 19285 1727203904.52499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpde6_l11_" to remote "/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py" <<< 19285 1727203904.54381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203904.54388: stdout chunk (state=3): >>><<< 19285 1727203904.54391: stderr chunk (state=3): >>><<< 19285 1727203904.54393: done transferring module to remote 19285 1727203904.54395: _low_level_execute_command(): starting 19285 1727203904.54397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/ /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py && sleep 0' 19285 1727203904.55088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203904.55112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203904.55136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203904.55158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203904.55292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203904.55557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203904.55606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203904.55805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203904.58366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203904.58372: stdout chunk (state=3): >>><<< 19285 1727203904.58381: stderr chunk (state=3): >>><<< 19285 1727203904.58554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203904.58561: _low_level_execute_command(): starting 19285 1727203904.58564: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/AnsiballZ_setup.py && sleep 0' 19285 1727203904.59452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203904.59471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203904.59565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203904.59600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 
1727203904.59711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203905.40023: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.46923828125, "5m": 0.3798828125, "15m": 0.19140625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "a<<< 19285 1727203905.40079: stdout chunk (state=3): >>>nsible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "45", "epoch": "1727203905", "epoch_int": "1727203905", "date": "2024-09-24", "time": "14:51:45", "iso8601_micro": "2024-09-24T18:51:45.010761Z", "iso8601": "2024-09-24T18:51:45Z", "iso8601_basic": "20240924T145145010761", "iso8601_basic_short": "20240924T145145", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2916, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 615, "free": 2916}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 491, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787971584, "block_size": 4096, "block_total": 65519099, "block_available": 63913079, "block_used": 1606020, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", 
"eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off 
[fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203905.42710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203905.42714: stdout chunk (state=3): >>><<< 19285 1727203905.42717: stderr chunk (state=3): >>><<< 19285 1727203905.42882: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 
10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.46923828125, "5m": 0.3798828125, "15m": 0.19140625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "45", "epoch": "1727203905", "epoch_int": "1727203905", "date": "2024-09-24", "time": "14:51:45", "iso8601_micro": "2024-09-24T18:51:45.010761Z", "iso8601": "2024-09-24T18:51:45Z", "iso8601_basic": "20240924T145145010761", "iso8601_basic_short": "20240924T145145", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2916, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 615, "free": 2916}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 491, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787971584, "block_size": 4096, "block_total": 65519099, "block_available": 63913079, "block_used": 1606020, "inode_total": 131070960, "inode_available": 131027260, "inode_used": 43700, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 
9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 
closed. 19285 1727203905.43168: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203905.43200: _low_level_execute_command(): starting 19285 1727203905.43215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203904.439841-19650-207453097347907/ > /dev/null 2>&1 && sleep 0' 19285 1727203905.43904: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203905.43920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203905.43936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203905.44036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203905.44058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203905.44167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203905.46930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203905.46942: stdout chunk (state=3): >>><<< 19285 1727203905.46954: stderr chunk (state=3): >>><<< 19285 1727203905.47002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203905.47025: handler run complete 19285 
1727203905.47327: variable 'ansible_facts' from source: unknown 19285 1727203905.47763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.48162: variable 'ansible_facts' from source: unknown 19285 1727203905.48251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.48400: attempt loop complete, returning result 19285 1727203905.48409: _execute() done 19285 1727203905.48418: dumping result to json 19285 1727203905.48452: done dumping result, returning 19285 1727203905.48524: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-0000000000da] 19285 1727203905.48527: sending task result for task 028d2410-947f-f31b-fb3f-0000000000da ok: [managed-node2] 19285 1727203905.49389: no more pending results, returning what we have 19285 1727203905.49392: results queue empty 19285 1727203905.49392: checking for any_errors_fatal 19285 1727203905.49394: done checking for any_errors_fatal 19285 1727203905.49394: checking for max_fail_percentage 19285 1727203905.49396: done checking for max_fail_percentage 19285 1727203905.49397: checking to see if all hosts have failed and the running result is not ok 19285 1727203905.49397: done checking to see if all hosts have failed 19285 1727203905.49398: getting the remaining hosts for this loop 19285 1727203905.49399: done getting the remaining hosts for this loop 19285 1727203905.49402: getting the next task for host managed-node2 19285 1727203905.49408: done getting next task for host managed-node2 19285 1727203905.49409: ^ task is: TASK: meta (flush_handlers) 19285 1727203905.49411: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19285 1727203905.49415: getting variables 19285 1727203905.49416: in VariableManager get_vars() 19285 1727203905.49438: Calling all_inventory to load vars for managed-node2 19285 1727203905.49440: Calling groups_inventory to load vars for managed-node2 19285 1727203905.49444: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.49450: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000da 19285 1727203905.49454: WORKER PROCESS EXITING 19285 1727203905.49463: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.49466: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.49469: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.49632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.49812: done with get_vars() 19285 1727203905.49822: done getting variables 19285 1727203905.49895: in VariableManager get_vars() 19285 1727203905.49904: Calling all_inventory to load vars for managed-node2 19285 1727203905.49906: Calling groups_inventory to load vars for managed-node2 19285 1727203905.49909: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.49913: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.49915: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.49918: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.50048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.50227: done with get_vars() 19285 1727203905.50239: done queuing things up, now waiting for results queue to drain 19285 1727203905.50240: results queue empty 19285 1727203905.50241: checking for any_errors_fatal 19285 1727203905.50244: done checking for any_errors_fatal 19285 1727203905.50244: checking for 
max_fail_percentage 19285 1727203905.50245: done checking for max_fail_percentage 19285 1727203905.50246: checking to see if all hosts have failed and the running result is not ok 19285 1727203905.50252: done checking to see if all hosts have failed 19285 1727203905.50253: getting the remaining hosts for this loop 19285 1727203905.50254: done getting the remaining hosts for this loop 19285 1727203905.50256: getting the next task for host managed-node2 19285 1727203905.50260: done getting next task for host managed-node2 19285 1727203905.50262: ^ task is: TASK: Set interface={{ interface }} 19285 1727203905.50263: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203905.50265: getting variables 19285 1727203905.50266: in VariableManager get_vars() 19285 1727203905.50273: Calling all_inventory to load vars for managed-node2 19285 1727203905.50277: Calling groups_inventory to load vars for managed-node2 19285 1727203905.50279: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.50284: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.50286: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.50289: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.50416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.50615: done with get_vars() 19285 1727203905.50622: done getting variables 19285 1727203905.50662: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203905.50778: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Tuesday 24 September 2024 14:51:45 -0400 (0:00:01.125) 0:00:04.582 ***** 19285 1727203905.50816: entering _queue_task() for managed-node2/set_fact 19285 1727203905.51282: worker is 1 (out of 1 available) 19285 1727203905.51290: exiting _queue_task() for managed-node2/set_fact 19285 1727203905.51300: done queuing things up, now waiting for results queue to drain 19285 1727203905.51301: waiting for pending results... 19285 1727203905.51342: running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 19285 1727203905.51526: in run() - task 028d2410-947f-f31b-fb3f-00000000000b 19285 1727203905.51530: variable 'ansible_search_path' from source: unknown 19285 1727203905.51533: calling self._execute() 19285 1727203905.51570: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.51583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.51599: variable 'omit' from source: magic vars 19285 1727203905.51940: variable 'ansible_distribution_major_version' from source: facts 19285 1727203905.51959: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203905.51969: variable 'omit' from source: magic vars 19285 1727203905.52000: variable 'omit' from source: magic vars 19285 1727203905.52030: variable 'interface' from source: play vars 19285 1727203905.52108: variable 'interface' from source: play vars 19285 1727203905.52132: variable 'omit' from source: magic vars 19285 1727203905.52382: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203905.52387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203905.52390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203905.52415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203905.52506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203905.52532: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203905.52542: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.52551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.52781: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203905.52784: Set connection var ansible_pipelining to False 19285 1727203905.52787: Set connection var ansible_timeout to 10 19285 1727203905.52789: Set connection var ansible_shell_type to sh 19285 1727203905.52791: Set connection var ansible_shell_executable to /bin/sh 19285 1727203905.52793: Set connection var ansible_connection to ssh 19285 1727203905.52891: variable 'ansible_shell_executable' from source: unknown 19285 1727203905.52921: variable 'ansible_connection' from source: unknown 19285 1727203905.52928: variable 'ansible_module_compression' from source: unknown 19285 1727203905.52941: variable 'ansible_shell_type' from source: unknown 19285 1727203905.52957: variable 'ansible_shell_executable' from source: unknown 19285 1727203905.53046: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.53049: variable 'ansible_pipelining' from source: unknown 19285 1727203905.53052: variable 'ansible_timeout' from 
source: unknown 19285 1727203905.53054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.53305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203905.53321: variable 'omit' from source: magic vars 19285 1727203905.53514: starting attempt loop 19285 1727203905.53518: running the handler 19285 1727203905.53520: handler run complete 19285 1727203905.53522: attempt loop complete, returning result 19285 1727203905.53524: _execute() done 19285 1727203905.53526: dumping result to json 19285 1727203905.53528: done dumping result, returning 19285 1727203905.53530: done running TaskExecutor() for managed-node2/TASK: Set interface=LSR-TST-br31 [028d2410-947f-f31b-fb3f-00000000000b] 19285 1727203905.53532: sending task result for task 028d2410-947f-f31b-fb3f-00000000000b 19285 1727203905.53596: done sending task result for task 028d2410-947f-f31b-fb3f-00000000000b 19285 1727203905.53599: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 19285 1727203905.53654: no more pending results, returning what we have 19285 1727203905.53657: results queue empty 19285 1727203905.53658: checking for any_errors_fatal 19285 1727203905.53660: done checking for any_errors_fatal 19285 1727203905.53661: checking for max_fail_percentage 19285 1727203905.53662: done checking for max_fail_percentage 19285 1727203905.53663: checking to see if all hosts have failed and the running result is not ok 19285 1727203905.53664: done checking to see if all hosts have failed 19285 1727203905.53665: getting the remaining hosts for this loop 19285 1727203905.53667: done getting the remaining hosts for this loop 19285 1727203905.53671: 
getting the next task for host managed-node2 19285 1727203905.53679: done getting next task for host managed-node2 19285 1727203905.53682: ^ task is: TASK: Include the task 'show_interfaces.yml' 19285 1727203905.53684: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203905.53688: getting variables 19285 1727203905.53689: in VariableManager get_vars() 19285 1727203905.53717: Calling all_inventory to load vars for managed-node2 19285 1727203905.53720: Calling groups_inventory to load vars for managed-node2 19285 1727203905.53724: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.53735: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.53738: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.53741: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.54061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.54813: done with get_vars() 19285 1727203905.54823: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Tuesday 24 September 2024 14:51:45 -0400 (0:00:00.040) 0:00:04.623 ***** 19285 1727203905.54911: entering _queue_task() for managed-node2/include_tasks 19285 1727203905.55566: worker is 1 (out of 1 available) 19285 1727203905.55580: exiting _queue_task() for managed-node2/include_tasks 19285 1727203905.55594: done queuing things up, now waiting for results queue to drain 19285 1727203905.55595: waiting for pending results... 
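The `Set interface=LSR-TST-br31` task traced above is a plain `set_fact` action: the log shows the `interface` variable resolving from play vars, the handler running, and `ansible_facts` coming back with `"interface": "LSR-TST-br31"`. As a rough sketch (the actual task sits at `tests_bridge.yml:9` in the fedora.linux_system_roles test suite; only the task name and resulting fact are visible in this trace), it would look something like:

```yaml
# Hedged reconstruction of the set_fact task seen in the trace.
# Only the templated task name ("Set interface={{ interface }}")
# and the resulting fact are confirmed by the log above.
- name: Set interface={{ interface }}
  set_fact:
    interface: LSR-TST-br31
```

This explains the two-step name resolution in the log: the task name is templated from the play var `interface` first (hence the banner `TASK [Set interface=LSR-TST-br31]`), then the fact of the same name is set on the host.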
19285 1727203905.56094: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 19285 1727203905.56220: in run() - task 028d2410-947f-f31b-fb3f-00000000000c 19285 1727203905.56224: variable 'ansible_search_path' from source: unknown 19285 1727203905.56266: calling self._execute() 19285 1727203905.56471: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.56477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.56480: variable 'omit' from source: magic vars 19285 1727203905.56863: variable 'ansible_distribution_major_version' from source: facts 19285 1727203905.56893: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203905.56906: _execute() done 19285 1727203905.56914: dumping result to json 19285 1727203905.56922: done dumping result, returning 19285 1727203905.56933: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-f31b-fb3f-00000000000c] 19285 1727203905.56943: sending task result for task 028d2410-947f-f31b-fb3f-00000000000c 19285 1727203905.57069: done sending task result for task 028d2410-947f-f31b-fb3f-00000000000c 19285 1727203905.57073: WORKER PROCESS EXITING 19285 1727203905.57109: no more pending results, returning what we have 19285 1727203905.57114: in VariableManager get_vars() 19285 1727203905.57145: Calling all_inventory to load vars for managed-node2 19285 1727203905.57148: Calling groups_inventory to load vars for managed-node2 19285 1727203905.57151: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.57164: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.57166: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.57169: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.57474: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.57692: done with get_vars() 19285 1727203905.57699: variable 'ansible_search_path' from source: unknown 19285 1727203905.57712: we have included files to process 19285 1727203905.57713: generating all_blocks data 19285 1727203905.57715: done generating all_blocks data 19285 1727203905.57715: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19285 1727203905.57716: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19285 1727203905.57719: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19285 1727203905.57900: in VariableManager get_vars() 19285 1727203905.57915: done with get_vars() 19285 1727203905.58036: done processing included file 19285 1727203905.58038: iterating over new_blocks loaded from include file 19285 1727203905.58040: in VariableManager get_vars() 19285 1727203905.58050: done with get_vars() 19285 1727203905.58052: filtering new block on tags 19285 1727203905.58071: done filtering new block on tags 19285 1727203905.58074: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 19285 1727203905.58091: extending task lists for all hosts with included blocks 19285 1727203905.58170: done extending task lists 19285 1727203905.58171: done processing included files 19285 1727203905.58172: results queue empty 19285 1727203905.58173: checking for any_errors_fatal 19285 1727203905.58263: done checking for any_errors_fatal 19285 1727203905.58265: checking for max_fail_percentage 19285 1727203905.58266: done checking for 
max_fail_percentage 19285 1727203905.58266: checking to see if all hosts have failed and the running result is not ok 19285 1727203905.58267: done checking to see if all hosts have failed 19285 1727203905.58268: getting the remaining hosts for this loop 19285 1727203905.58269: done getting the remaining hosts for this loop 19285 1727203905.58272: getting the next task for host managed-node2 19285 1727203905.58278: done getting next task for host managed-node2 19285 1727203905.58280: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 19285 1727203905.58282: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203905.58285: getting variables 19285 1727203905.58286: in VariableManager get_vars() 19285 1727203905.58294: Calling all_inventory to load vars for managed-node2 19285 1727203905.58308: Calling groups_inventory to load vars for managed-node2 19285 1727203905.58311: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.58316: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.58319: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.58322: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.58474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.58713: done with get_vars() 19285 1727203905.58722: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:51:45 -0400 (0:00:00.040) 0:00:04.663 ***** 19285 1727203905.58918: entering _queue_task() for managed-node2/include_tasks 19285 1727203905.59624: worker is 1 (out of 1 available) 19285 1727203905.59634: exiting _queue_task() for managed-node2/include_tasks 19285 1727203905.59644: done queuing things up, now waiting for results queue to drain 19285 1727203905.59645: waiting for pending results... 
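The trace above shows the include machinery at work: `show_interfaces.yml` is located under the collection's `tests/network/playbooks/tasks/` directory, loaded, filtered on tags, and its blocks spliced into the task list for managed-node2 ("extending task lists for all hosts with included blocks"). In playbook terms this corresponds to an `include_tasks` call roughly like the following (the file path is taken from the log; the exact keyword spelling in the source playbook is not shown in this excerpt):

```yaml
# Sketch of the include traced above; the real call sits at
# tests_bridge.yml:12 and pulls in tasks/show_interfaces.yml,
# whose own line 3 then includes get_current_interfaces.yml.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
```

Because `include_tasks` is dynamic, the included blocks are only generated and appended at run time, which is why the log reports "we have included files to process" and "generating all_blocks data" after the include task itself has already returned ok.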
19285 1727203905.60181: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 19285 1727203905.60213: in run() - task 028d2410-947f-f31b-fb3f-0000000000ee 19285 1727203905.60247: variable 'ansible_search_path' from source: unknown 19285 1727203905.60273: variable 'ansible_search_path' from source: unknown 19285 1727203905.60389: calling self._execute() 19285 1727203905.60477: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.60494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.60509: variable 'omit' from source: magic vars 19285 1727203905.60964: variable 'ansible_distribution_major_version' from source: facts 19285 1727203905.60983: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203905.60993: _execute() done 19285 1727203905.61001: dumping result to json 19285 1727203905.61007: done dumping result, returning 19285 1727203905.61035: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-f31b-fb3f-0000000000ee] 19285 1727203905.61037: sending task result for task 028d2410-947f-f31b-fb3f-0000000000ee 19285 1727203905.61152: no more pending results, returning what we have 19285 1727203905.61157: in VariableManager get_vars() 19285 1727203905.61203: Calling all_inventory to load vars for managed-node2 19285 1727203905.61206: Calling groups_inventory to load vars for managed-node2 19285 1727203905.61210: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.61224: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.61226: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.61229: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.61649: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000ee 19285 1727203905.61652: WORKER PROCESS EXITING 19285 
1727203905.61678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.61899: done with get_vars() 19285 1727203905.61906: variable 'ansible_search_path' from source: unknown 19285 1727203905.61907: variable 'ansible_search_path' from source: unknown 19285 1727203905.61956: we have included files to process 19285 1727203905.61957: generating all_blocks data 19285 1727203905.61962: done generating all_blocks data 19285 1727203905.61963: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19285 1727203905.61964: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19285 1727203905.61967: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19285 1727203905.62314: done processing included file 19285 1727203905.62316: iterating over new_blocks loaded from include file 19285 1727203905.62317: in VariableManager get_vars() 19285 1727203905.62330: done with get_vars() 19285 1727203905.62331: filtering new block on tags 19285 1727203905.62436: done filtering new block on tags 19285 1727203905.62439: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 19285 1727203905.62444: extending task lists for all hosts with included blocks 19285 1727203905.62565: done extending task lists 19285 1727203905.62573: done processing included files 19285 1727203905.62579: results queue empty 19285 1727203905.62579: checking for any_errors_fatal 19285 1727203905.62582: done checking for any_errors_fatal 19285 1727203905.62583: checking for max_fail_percentage 19285 1727203905.62584: done 
checking for max_fail_percentage 19285 1727203905.62585: checking to see if all hosts have failed and the running result is not ok 19285 1727203905.62585: done checking to see if all hosts have failed 19285 1727203905.62586: getting the remaining hosts for this loop 19285 1727203905.62587: done getting the remaining hosts for this loop 19285 1727203905.62590: getting the next task for host managed-node2 19285 1727203905.62594: done getting next task for host managed-node2 19285 1727203905.62596: ^ task is: TASK: Gather current interface info 19285 1727203905.62599: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203905.62601: getting variables 19285 1727203905.62602: in VariableManager get_vars() 19285 1727203905.62614: Calling all_inventory to load vars for managed-node2 19285 1727203905.62616: Calling groups_inventory to load vars for managed-node2 19285 1727203905.62618: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203905.62623: Calling all_plugins_play to load vars for managed-node2 19285 1727203905.62626: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203905.62628: Calling groups_plugins_play to load vars for managed-node2 19285 1727203905.62832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203905.63049: done with get_vars() 19285 1727203905.63061: done getting variables 19285 1727203905.63102: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:51:45 -0400 (0:00:00.042) 0:00:04.705 ***** 19285 1727203905.63140: entering _queue_task() for managed-node2/command 19285 1727203905.63558: worker is 1 (out of 1 available) 19285 1727203905.63572: exiting _queue_task() for managed-node2/command 19285 1727203905.63592: done queuing things up, now waiting for results queue to drain 19285 1727203905.63594: waiting for pending results... 
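The `Gather current interface info` task resolves to the `command` action module (the log shows `Loading ActionModule 'command'` and the SSH connection being set up), but the command string itself is truncated in this excerpt. A plausible shape, assumed rather than confirmed by this log, is a listing of the kernel's network interface directory, which is the conventional way such test helpers enumerate interfaces:

```yaml
# Assumed sketch: the trace only confirms that get_current_interfaces.yml:3
# runs a `command` task over SSH; the exact command line is cut off.
# Listing /sys/class/net and registering the result is a common pattern
# for snapshotting interface names before and after a test.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: current_interfaces
  changed_when: false
```

Registering the output with `changed_when: false` keeps the task idempotent in reporting terms, which matches the read-only "gather" intent of the task name in the trace.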
19285 1727203905.63762: running TaskExecutor() for managed-node2/TASK: Gather current interface info 19285 1727203905.63892: in run() - task 028d2410-947f-f31b-fb3f-0000000000fd 19285 1727203905.63923: variable 'ansible_search_path' from source: unknown 19285 1727203905.63934: variable 'ansible_search_path' from source: unknown 19285 1727203905.63980: calling self._execute() 19285 1727203905.64085: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.64102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.64118: variable 'omit' from source: magic vars 19285 1727203905.64553: variable 'ansible_distribution_major_version' from source: facts 19285 1727203905.64586: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203905.64598: variable 'omit' from source: magic vars 19285 1727203905.64650: variable 'omit' from source: magic vars 19285 1727203905.64754: variable 'omit' from source: magic vars 19285 1727203905.64758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203905.64813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203905.64838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203905.64868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203905.64894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203905.64931: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203905.64940: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.64973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 
1727203905.65080: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203905.65095: Set connection var ansible_pipelining to False 19285 1727203905.65179: Set connection var ansible_timeout to 10 19285 1727203905.65187: Set connection var ansible_shell_type to sh 19285 1727203905.65190: Set connection var ansible_shell_executable to /bin/sh 19285 1727203905.65192: Set connection var ansible_connection to ssh 19285 1727203905.65193: variable 'ansible_shell_executable' from source: unknown 19285 1727203905.65195: variable 'ansible_connection' from source: unknown 19285 1727203905.65197: variable 'ansible_module_compression' from source: unknown 19285 1727203905.65199: variable 'ansible_shell_type' from source: unknown 19285 1727203905.65201: variable 'ansible_shell_executable' from source: unknown 19285 1727203905.65202: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203905.65204: variable 'ansible_pipelining' from source: unknown 19285 1727203905.65206: variable 'ansible_timeout' from source: unknown 19285 1727203905.65208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203905.65376: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203905.65396: variable 'omit' from source: magic vars 19285 1727203905.65438: starting attempt loop 19285 1727203905.65441: running the handler 19285 1727203905.65517: _low_level_execute_command(): starting 19285 1727203905.65520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203905.66413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203905.66436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203905.66472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203905.66514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203905.66649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203905.69006: stdout chunk (state=3): >>>/root <<< 19285 1727203905.69183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203905.69303: stderr chunk (state=3): >>><<< 19285 1727203905.69306: stdout chunk (state=3): >>><<< 19285 1727203905.69680: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203905.69684: _low_level_execute_command(): starting 19285 1727203905.69687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997 `" && echo ansible-tmp-1727203905.6957812-19711-269759400650997="` echo /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997 `" ) && sleep 0' 19285 1727203905.70665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203905.70669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203905.70736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203905.70761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 19285 1727203905.70772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203905.70843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203905.70991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203905.71103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203905.73836: stdout chunk (state=3): >>>ansible-tmp-1727203905.6957812-19711-269759400650997=/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997 <<< 19285 1727203905.74218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203905.74222: stdout chunk (state=3): >>><<< 19285 1727203905.74224: stderr chunk (state=3): >>><<< 19285 1727203905.74226: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203905.6957812-19711-269759400650997=/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19285 1727203905.74254: variable 'ansible_module_compression' from source: unknown 19285 1727203905.74308: ANSIBALLZ: Using generic lock for ansible.legacy.command 19285 1727203905.74311: ANSIBALLZ: Acquiring lock 19285 1727203905.74314: ANSIBALLZ: Lock acquired: 140487240913488 19285 1727203905.74316: ANSIBALLZ: Creating module 19285 1727203905.86348: ANSIBALLZ: Writing module into payload 19285 1727203905.86465: ANSIBALLZ: Writing module 19285 1727203905.86489: ANSIBALLZ: Renaming module 19285 1727203905.86496: ANSIBALLZ: Done creating module 19285 1727203905.86505: variable 'ansible_facts' from source: unknown 19285 1727203905.86561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py 19285 1727203905.86682: Sending initial data 19285 1727203905.86686: Sent initial data (156 bytes) 19285 1727203905.87428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203905.87448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203905.87559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19285 1727203905.89385: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203905.89431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203905.89606: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp3iuyg5f6 /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py <<< 19285 1727203905.89610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py" <<< 19285 1727203905.89798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp3iuyg5f6" to remote "/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py" <<< 19285 1727203905.91383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203905.91387: stdout chunk (state=3): >>><<< 19285 1727203905.91389: stderr chunk (state=3): >>><<< 19285 1727203905.91392: done transferring module to remote 19285 1727203905.91394: _low_level_execute_command(): starting 19285 1727203905.91396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/ /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py && sleep 0' 19285 1727203905.92377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203905.92397: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203905.92578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203905.92666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203905.92884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203905.95270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203905.95295: stderr chunk (state=3): >>><<< 19285 1727203905.95310: stdout chunk (state=3): >>><<< 19285 1727203905.95331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203905.95336: _low_level_execute_command(): starting 19285 1727203905.95343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/AnsiballZ_command.py && sleep 0' 19285 1727203905.95992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203905.96014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203905.96084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203905.96098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203905.96148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203905.96164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203905.96393: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19285 1727203905.96547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.20102: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:51:46.195072", "end": "2024-09-24 14:51:46.199966", "delta": "0:00:00.004894", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203906.22357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203906.22364: stdout chunk (state=3): >>><<< 19285 1727203906.22366: stderr chunk (state=3): >>><<< 19285 1727203906.22436: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:51:46.195072", "end": "2024-09-24 14:51:46.199966", "delta": "0:00:00.004894", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203906.22440: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203906.22451: _low_level_execute_command(): starting 19285 1727203906.22465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203905.6957812-19711-269759400650997/ > /dev/null 2>&1 && sleep 0' 19285 1727203906.23187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203906.23231: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203906.23306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.23320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.23373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203906.23395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.23429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.23563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.26382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.26390: stdout chunk (state=3): >>><<< 19285 1727203906.26392: stderr chunk (state=3): >>><<< 19285 1727203906.26395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.26397: handler run complete 19285 1727203906.26399: Evaluated conditional (False): False 19285 1727203906.26401: attempt loop complete, returning result 19285 1727203906.26403: _execute() done 19285 1727203906.26404: dumping result to json 19285 1727203906.26406: done dumping result, returning 19285 1727203906.26408: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-f31b-fb3f-0000000000fd] 19285 1727203906.26410: sending task result for task 028d2410-947f-f31b-fb3f-0000000000fd 19285 1727203906.26583: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000fd ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004894", "end": "2024-09-24 14:51:46.199966", "rc": 0, "start": "2024-09-24 14:51:46.195072" } STDOUT: bonding_masters eth0 lo 19285 1727203906.26664: no more pending results, returning what we have 19285 1727203906.26668: results queue empty 19285 1727203906.26669: checking for any_errors_fatal 19285 1727203906.26671: done checking for any_errors_fatal 19285 1727203906.26672: checking for max_fail_percentage 19285 
1727203906.26674: done checking for max_fail_percentage 19285 1727203906.26677: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.26678: done checking to see if all hosts have failed 19285 1727203906.26679: getting the remaining hosts for this loop 19285 1727203906.26680: done getting the remaining hosts for this loop 19285 1727203906.26684: getting the next task for host managed-node2 19285 1727203906.26691: done getting next task for host managed-node2 19285 1727203906.26694: ^ task is: TASK: Set current_interfaces 19285 1727203906.26698: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.26701: getting variables 19285 1727203906.26703: in VariableManager get_vars() 19285 1727203906.26736: Calling all_inventory to load vars for managed-node2 19285 1727203906.26739: Calling groups_inventory to load vars for managed-node2 19285 1727203906.26743: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.26755: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.26758: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.26763: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.27339: WORKER PROCESS EXITING 19285 1727203906.27450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.28008: done with get_vars() 19285 1727203906.28019: done getting variables 19285 1727203906.28183: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.650) 0:00:05.356 ***** 19285 1727203906.28225: entering _queue_task() for managed-node2/set_fact 19285 1727203906.28859: worker is 1 (out of 1 available) 19285 1727203906.28873: exiting _queue_task() for managed-node2/set_fact 19285 1727203906.28906: done queuing things up, now waiting for results queue to drain 19285 1727203906.28908: waiting for pending results... 
19285 1727203906.29255: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 19285 1727203906.29449: in run() - task 028d2410-947f-f31b-fb3f-0000000000fe 19285 1727203906.29464: variable 'ansible_search_path' from source: unknown 19285 1727203906.29468: variable 'ansible_search_path' from source: unknown 19285 1727203906.29663: calling self._execute() 19285 1727203906.29803: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.29807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.29829: variable 'omit' from source: magic vars 19285 1727203906.30532: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.30537: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.30540: variable 'omit' from source: magic vars 19285 1727203906.30862: variable 'omit' from source: magic vars 19285 1727203906.30913: variable '_current_interfaces' from source: set_fact 19285 1727203906.30980: variable 'omit' from source: magic vars 19285 1727203906.31125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203906.31156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203906.31181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203906.31304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.31349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.31352: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203906.31354: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.31357: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.31580: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203906.31689: Set connection var ansible_pipelining to False 19285 1727203906.31692: Set connection var ansible_timeout to 10 19285 1727203906.31694: Set connection var ansible_shell_type to sh 19285 1727203906.31696: Set connection var ansible_shell_executable to /bin/sh 19285 1727203906.31698: Set connection var ansible_connection to ssh 19285 1727203906.31781: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.31784: variable 'ansible_connection' from source: unknown 19285 1727203906.31787: variable 'ansible_module_compression' from source: unknown 19285 1727203906.31789: variable 'ansible_shell_type' from source: unknown 19285 1727203906.31981: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.31984: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.31986: variable 'ansible_pipelining' from source: unknown 19285 1727203906.31988: variable 'ansible_timeout' from source: unknown 19285 1727203906.31990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.32135: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203906.32197: variable 'omit' from source: magic vars 19285 1727203906.32389: starting attempt loop 19285 1727203906.32393: running the handler 19285 1727203906.32395: handler run complete 19285 1727203906.32398: attempt loop complete, returning result 19285 1727203906.32404: _execute() done 19285 1727203906.32407: dumping result to json 19285 1727203906.32410: done dumping result, returning 19285 
1727203906.32412: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-f31b-fb3f-0000000000fe] 19285 1727203906.32414: sending task result for task 028d2410-947f-f31b-fb3f-0000000000fe ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 19285 1727203906.32556: no more pending results, returning what we have 19285 1727203906.32563: results queue empty 19285 1727203906.32564: checking for any_errors_fatal 19285 1727203906.32577: done checking for any_errors_fatal 19285 1727203906.32578: checking for max_fail_percentage 19285 1727203906.32580: done checking for max_fail_percentage 19285 1727203906.32581: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.32582: done checking to see if all hosts have failed 19285 1727203906.32582: getting the remaining hosts for this loop 19285 1727203906.32584: done getting the remaining hosts for this loop 19285 1727203906.32588: getting the next task for host managed-node2 19285 1727203906.32597: done getting next task for host managed-node2 19285 1727203906.32599: ^ task is: TASK: Show current_interfaces 19285 1727203906.32603: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.32607: getting variables 19285 1727203906.32609: in VariableManager get_vars() 19285 1727203906.32756: Calling all_inventory to load vars for managed-node2 19285 1727203906.32761: Calling groups_inventory to load vars for managed-node2 19285 1727203906.32766: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.32854: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.32858: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.32866: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000fe 19285 1727203906.32868: WORKER PROCESS EXITING 19285 1727203906.32873: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.33168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.33384: done with get_vars() 19285 1727203906.33398: done getting variables 19285 1727203906.33498: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.052) 0:00:05.409 ***** 19285 1727203906.33527: entering _queue_task() for managed-node2/debug 19285 1727203906.33529: Creating lock for debug 19285 1727203906.33858: worker is 1 (out of 1 available) 19285 1727203906.33870: exiting _queue_task() for managed-node2/debug 19285 1727203906.33885: done queuing things up, now waiting for results queue to drain 19285 1727203906.33886: waiting for pending results... 
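The `set_fact` result above shows `current_interfaces` being promoted from the internal `_current_interfaces` variable. The task itself is not reproduced in the log; based on its name, its path (`tasks/get_current_interfaces.yml:9`), and the variables it reads, a plausible reconstruction is:

```yaml
# Hypothetical reconstruction of the "Set current_interfaces" task
# (tasks/get_current_interfaces.yml:9). Inferred from the log, which shows
# the task reading `_current_interfaces` (source: set_fact) and producing
# ansible_facts.current_interfaces -- this is not the actual playbook source.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces }}"
```

Because `set_fact` executes entirely on the controller, the handler completes inside the worker with no remote connection, which is consistent with the log showing "running the handler" followed immediately by "handler run complete" and no `_low_level_execute_command()` calls for this task.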
19285 1727203906.34125: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 19285 1727203906.34268: in run() - task 028d2410-947f-f31b-fb3f-0000000000ef 19285 1727203906.34299: variable 'ansible_search_path' from source: unknown 19285 1727203906.34308: variable 'ansible_search_path' from source: unknown 19285 1727203906.34373: calling self._execute() 19285 1727203906.34469: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.34551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.34591: variable 'omit' from source: magic vars 19285 1727203906.34894: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.34916: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.34983: variable 'omit' from source: magic vars 19285 1727203906.34986: variable 'omit' from source: magic vars 19285 1727203906.35080: variable 'current_interfaces' from source: set_fact 19285 1727203906.35115: variable 'omit' from source: magic vars 19285 1727203906.35170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203906.35213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203906.35243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203906.35268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.35290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.35320: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203906.35350: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.35353: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.35443: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203906.35569: Set connection var ansible_pipelining to False 19285 1727203906.35573: Set connection var ansible_timeout to 10 19285 1727203906.35577: Set connection var ansible_shell_type to sh 19285 1727203906.35579: Set connection var ansible_shell_executable to /bin/sh 19285 1727203906.35581: Set connection var ansible_connection to ssh 19285 1727203906.35583: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.35585: variable 'ansible_connection' from source: unknown 19285 1727203906.35587: variable 'ansible_module_compression' from source: unknown 19285 1727203906.35589: variable 'ansible_shell_type' from source: unknown 19285 1727203906.35591: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.35593: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.35595: variable 'ansible_pipelining' from source: unknown 19285 1727203906.35597: variable 'ansible_timeout' from source: unknown 19285 1727203906.35599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.35714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203906.35722: variable 'omit' from source: magic vars 19285 1727203906.35726: starting attempt loop 19285 1727203906.35728: running the handler 19285 1727203906.35787: handler run complete 19285 1727203906.35797: attempt loop complete, returning result 19285 1727203906.35799: _execute() done 19285 1727203906.35802: dumping result to json 19285 1727203906.35804: done dumping result, returning 19285 1727203906.35811: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-f31b-fb3f-0000000000ef] 19285 1727203906.35815: sending task result for task 028d2410-947f-f31b-fb3f-0000000000ef ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 19285 1727203906.35948: no more pending results, returning what we have 19285 1727203906.35951: results queue empty 19285 1727203906.35952: checking for any_errors_fatal 19285 1727203906.35957: done checking for any_errors_fatal 19285 1727203906.35957: checking for max_fail_percentage 19285 1727203906.35959: done checking for max_fail_percentage 19285 1727203906.35960: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.35961: done checking to see if all hosts have failed 19285 1727203906.35961: getting the remaining hosts for this loop 19285 1727203906.35963: done getting the remaining hosts for this loop 19285 1727203906.35966: getting the next task for host managed-node2 19285 1727203906.35974: done getting next task for host managed-node2 19285 1727203906.35979: ^ task is: TASK: Include the task 'assert_device_absent.yml' 19285 1727203906.35981: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.35985: getting variables 19285 1727203906.35986: in VariableManager get_vars() 19285 1727203906.36015: Calling all_inventory to load vars for managed-node2 19285 1727203906.36018: Calling groups_inventory to load vars for managed-node2 19285 1727203906.36021: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.36031: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.36033: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.36035: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.36214: done sending task result for task 028d2410-947f-f31b-fb3f-0000000000ef 19285 1727203906.36218: WORKER PROCESS EXITING 19285 1727203906.36230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.36347: done with get_vars() 19285 1727203906.36354: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.028) 0:00:05.438 ***** 19285 1727203906.36420: entering _queue_task() for managed-node2/include_tasks 19285 1727203906.36618: worker is 1 (out of 1 available) 19285 1727203906.36630: exiting _queue_task() for managed-node2/include_tasks 19285 1727203906.36641: done queuing things up, now waiting for results queue to drain 19285 1727203906.36643: waiting for pending results... 
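The `debug` task that printed the `MSG: current_interfaces: [...]` line above (`tasks/show_interfaces.yml:5`) is likewise not shown in the log; judging by the rendered message format, it plausibly looks like:

```yaml
# Hypothetical sketch of the "Show current_interfaces" task
# (tasks/show_interfaces.yml:5), inferred from the rendered MSG in the
# log -- not the actual playbook source.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

Like `set_fact`, `debug` is a controller-side action: the log records the connection and shell plugins being loaded for the task, but no SSH round trip occurs before the result is returned.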
19285 1727203906.36789: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 19285 1727203906.36841: in run() - task 028d2410-947f-f31b-fb3f-00000000000d 19285 1727203906.36852: variable 'ansible_search_path' from source: unknown 19285 1727203906.36911: calling self._execute() 19285 1727203906.36955: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.36958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.36967: variable 'omit' from source: magic vars 19285 1727203906.37346: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.37367: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.37381: _execute() done 19285 1727203906.37389: dumping result to json 19285 1727203906.37439: done dumping result, returning 19285 1727203906.37443: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [028d2410-947f-f31b-fb3f-00000000000d] 19285 1727203906.37446: sending task result for task 028d2410-947f-f31b-fb3f-00000000000d 19285 1727203906.37608: no more pending results, returning what we have 19285 1727203906.37613: in VariableManager get_vars() 19285 1727203906.37692: Calling all_inventory to load vars for managed-node2 19285 1727203906.37695: Calling groups_inventory to load vars for managed-node2 19285 1727203906.37700: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.37714: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.37717: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.37720: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.38102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.38492: done with get_vars() 19285 1727203906.38500: variable 'ansible_search_path' 
from source: unknown 19285 1727203906.38514: we have included files to process 19285 1727203906.38515: generating all_blocks data 19285 1727203906.38517: done generating all_blocks data 19285 1727203906.38525: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203906.38527: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203906.38531: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203906.38542: done sending task result for task 028d2410-947f-f31b-fb3f-00000000000d 19285 1727203906.38545: WORKER PROCESS EXITING 19285 1727203906.38729: in VariableManager get_vars() 19285 1727203906.38754: done with get_vars() 19285 1727203906.38904: done processing included file 19285 1727203906.38906: iterating over new_blocks loaded from include file 19285 1727203906.38910: in VariableManager get_vars() 19285 1727203906.38924: done with get_vars() 19285 1727203906.38926: filtering new block on tags 19285 1727203906.38949: done filtering new block on tags 19285 1727203906.38951: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 19285 1727203906.38956: extending task lists for all hosts with included blocks 19285 1727203906.39135: done extending task lists 19285 1727203906.39136: done processing included files 19285 1727203906.39137: results queue empty 19285 1727203906.39138: checking for any_errors_fatal 19285 1727203906.39142: done checking for any_errors_fatal 19285 1727203906.39143: checking for max_fail_percentage 19285 1727203906.39144: done checking for max_fail_percentage 19285 1727203906.39149: checking to see 
if all hosts have failed and the running result is not ok 19285 1727203906.39150: done checking to see if all hosts have failed 19285 1727203906.39150: getting the remaining hosts for this loop 19285 1727203906.39152: done getting the remaining hosts for this loop 19285 1727203906.39154: getting the next task for host managed-node2 19285 1727203906.39160: done getting next task for host managed-node2 19285 1727203906.39163: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19285 1727203906.39166: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.39168: getting variables 19285 1727203906.39169: in VariableManager get_vars() 19285 1727203906.39179: Calling all_inventory to load vars for managed-node2 19285 1727203906.39181: Calling groups_inventory to load vars for managed-node2 19285 1727203906.39183: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.39188: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.39191: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.39193: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.39397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.39598: done with get_vars() 19285 1727203906.39607: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.032) 0:00:05.471 ***** 19285 1727203906.39676: entering _queue_task() for managed-node2/include_tasks 19285 1727203906.39947: worker is 1 (out of 1 available) 19285 1727203906.39962: exiting _queue_task() for managed-node2/include_tasks 19285 1727203906.39978: done queuing things up, now waiting for results queue to drain 19285 1727203906.39980: waiting for pending results... 
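The "we have included files to process" / "extending task lists for all hosts with included blocks" phase above is how a dynamic `include_tasks` executes: the include itself runs as a controller-side task, and the blocks loaded from the included file are spliced into the host's task list afterwards. The include at `tests_bridge.yml:14` is not shown in the log, but only minimal arguments are needed to produce this trace; it plausibly reads:

```yaml
# Hypothetical sketch of the include at tests_bridge.yml:14; only the
# task name and the included file's path appear in the log.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
```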
19285 1727203906.40233: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 19285 1727203906.40315: in run() - task 028d2410-947f-f31b-fb3f-000000000119 19285 1727203906.40380: variable 'ansible_search_path' from source: unknown 19285 1727203906.40384: variable 'ansible_search_path' from source: unknown 19285 1727203906.40390: calling self._execute() 19285 1727203906.40478: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.40489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.40503: variable 'omit' from source: magic vars 19285 1727203906.40881: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.40978: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.40981: _execute() done 19285 1727203906.40985: dumping result to json 19285 1727203906.40987: done dumping result, returning 19285 1727203906.40989: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-f31b-fb3f-000000000119] 19285 1727203906.40991: sending task result for task 028d2410-947f-f31b-fb3f-000000000119 19285 1727203906.41056: done sending task result for task 028d2410-947f-f31b-fb3f-000000000119 19285 1727203906.41061: WORKER PROCESS EXITING 19285 1727203906.41102: no more pending results, returning what we have 19285 1727203906.41108: in VariableManager get_vars() 19285 1727203906.41143: Calling all_inventory to load vars for managed-node2 19285 1727203906.41147: Calling groups_inventory to load vars for managed-node2 19285 1727203906.41151: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.41168: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.41454: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.41462: Calling groups_plugins_play to load vars for managed-node2 19285 
1727203906.41639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.41842: done with get_vars() 19285 1727203906.41849: variable 'ansible_search_path' from source: unknown 19285 1727203906.41850: variable 'ansible_search_path' from source: unknown 19285 1727203906.41888: we have included files to process 19285 1727203906.41889: generating all_blocks data 19285 1727203906.41891: done generating all_blocks data 19285 1727203906.41892: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203906.41894: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203906.41896: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203906.42149: done processing included file 19285 1727203906.42151: iterating over new_blocks loaded from include file 19285 1727203906.42153: in VariableManager get_vars() 19285 1727203906.42168: done with get_vars() 19285 1727203906.42169: filtering new block on tags 19285 1727203906.42187: done filtering new block on tags 19285 1727203906.42189: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 19285 1727203906.42194: extending task lists for all hosts with included blocks 19285 1727203906.42343: done extending task lists 19285 1727203906.42349: done processing included files 19285 1727203906.42349: results queue empty 19285 1727203906.42350: checking for any_errors_fatal 19285 1727203906.42353: done checking for any_errors_fatal 19285 1727203906.42353: checking for max_fail_percentage 19285 1727203906.42354: done checking for 
max_fail_percentage 19285 1727203906.42355: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.42356: done checking to see if all hosts have failed 19285 1727203906.42357: getting the remaining hosts for this loop 19285 1727203906.42358: done getting the remaining hosts for this loop 19285 1727203906.42363: getting the next task for host managed-node2 19285 1727203906.42367: done getting next task for host managed-node2 19285 1727203906.42369: ^ task is: TASK: Get stat for interface {{ interface }} 19285 1727203906.42371: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.42374: getting variables 19285 1727203906.42374: in VariableManager get_vars() 19285 1727203906.42384: Calling all_inventory to load vars for managed-node2 19285 1727203906.42386: Calling groups_inventory to load vars for managed-node2 19285 1727203906.42388: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.42392: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.42395: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.42397: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.42538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.42728: done with get_vars() 19285 1727203906.42737: done getting variables 19285 1727203906.42911: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.032) 0:00:05.503 ***** 19285 1727203906.42940: entering _queue_task() for managed-node2/stat 19285 1727203906.43390: worker is 1 (out of 1 available) 19285 1727203906.43400: exiting _queue_task() for managed-node2/stat 19285 1727203906.43409: done queuing things up, now waiting for results queue to drain 19285 1727203906.43410: waiting for pending results... 
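The templated task name above ("Get stat for interface {{ interface }}", rendered to "Get stat for interface LSR-TST-br31") indicates a `stat` task parameterized by the `interface` fact. The log does not show the module arguments; since the enclosing file is `assert_device_absent.yml`, the task plausibly stats the device's sysfs entry. The exact path and register name below are assumptions, not taken from the log:

```yaml
# Hypothetical sketch of tasks/get_interface_stat.yml:3. The sysfs path
# and the `interface_stat` register name are inferred, not from the log.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}
  register: interface_stat
```

Unlike the preceding `set_fact` and `debug` tasks, `stat` must run on the managed node, which is why the log immediately continues with `_low_level_execute_command()` opening a multiplexed SSH session (`mux_client_request_session`) and creating a per-task remote temp directory under `/root/.ansible/tmp`.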
19285 1727203906.43708: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 19285 1727203906.43713: in run() - task 028d2410-947f-f31b-fb3f-000000000133 19285 1727203906.43715: variable 'ansible_search_path' from source: unknown 19285 1727203906.43718: variable 'ansible_search_path' from source: unknown 19285 1727203906.43720: calling self._execute() 19285 1727203906.43770: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.43783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.43806: variable 'omit' from source: magic vars 19285 1727203906.44169: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.44189: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.44200: variable 'omit' from source: magic vars 19285 1727203906.44254: variable 'omit' from source: magic vars 19285 1727203906.44356: variable 'interface' from source: set_fact 19285 1727203906.44464: variable 'omit' from source: magic vars 19285 1727203906.44467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203906.44470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203906.44497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203906.44520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.44537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.44583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203906.44593: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.44602: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.44717: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203906.44731: Set connection var ansible_pipelining to False 19285 1727203906.44742: Set connection var ansible_timeout to 10 19285 1727203906.44749: Set connection var ansible_shell_type to sh 19285 1727203906.44765: Set connection var ansible_shell_executable to /bin/sh 19285 1727203906.44772: Set connection var ansible_connection to ssh 19285 1727203906.44807: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.44895: variable 'ansible_connection' from source: unknown 19285 1727203906.44900: variable 'ansible_module_compression' from source: unknown 19285 1727203906.44902: variable 'ansible_shell_type' from source: unknown 19285 1727203906.44904: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.44906: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.44908: variable 'ansible_pipelining' from source: unknown 19285 1727203906.44910: variable 'ansible_timeout' from source: unknown 19285 1727203906.44912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.45112: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203906.45117: variable 'omit' from source: magic vars 19285 1727203906.45119: starting attempt loop 19285 1727203906.45121: running the handler 19285 1727203906.45180: _low_level_execute_command(): starting 19285 1727203906.45183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203906.46005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.46067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203906.46096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.46220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.46320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.48028: stdout chunk (state=3): >>>/root <<< 19285 1727203906.48212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.48216: stdout chunk (state=3): >>><<< 19285 1727203906.48218: stderr chunk (state=3): >>><<< 19285 1727203906.48436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.48440: _low_level_execute_command(): starting 19285 1727203906.48443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654 `" && echo ansible-tmp-1727203906.4833648-19755-126367977159654="` echo /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654 `" ) && sleep 0' 19285 1727203906.49524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203906.49533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203906.49545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203906.49561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203906.49626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.49952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.49984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.50084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.52039: stdout chunk (state=3): >>>ansible-tmp-1727203906.4833648-19755-126367977159654=/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654 <<< 19285 1727203906.52199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.52202: stdout chunk (state=3): >>><<< 19285 1727203906.52205: stderr chunk (state=3): >>><<< 19285 1727203906.52243: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203906.4833648-19755-126367977159654=/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.52285: variable 'ansible_module_compression' from source: unknown 19285 1727203906.52521: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19285 1727203906.52525: variable 'ansible_facts' from source: unknown 19285 1727203906.52665: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py 19285 1727203906.53146: Sending initial data 19285 1727203906.53149: Sent initial data (153 bytes) 19285 1727203906.54303: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.54420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.54482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.54594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.56245: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203906.56319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203906.56398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpjj48mm0f /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py <<< 19285 1727203906.56402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py" <<< 19285 1727203906.56503: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpjj48mm0f" to remote "/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py" <<< 19285 1727203906.57931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.58007: stderr chunk (state=3): >>><<< 19285 1727203906.58015: stdout chunk (state=3): >>><<< 19285 1727203906.58099: done transferring module to remote 19285 1727203906.58114: _low_level_execute_command(): starting 19285 1727203906.58123: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/ /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py && sleep 0' 19285 1727203906.59221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203906.59235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.59309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203906.59338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.59391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.59512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.61391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.61395: stdout chunk (state=3): >>><<< 19285 1727203906.61397: stderr chunk (state=3): >>><<< 19285 1727203906.61417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.61482: _low_level_execute_command(): starting 19285 1727203906.61485: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/AnsiballZ_stat.py && sleep 0' 19285 1727203906.62144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.62197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203906.62220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.62234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.62410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
19285 1727203906.77812: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19285 1727203906.79273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203906.79362: stderr chunk (state=3): >>><<< 19285 1727203906.79366: stdout chunk (state=3): >>><<< 19285 1727203906.79506: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203906.79510: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203906.79514: _low_level_execute_command(): starting 19285 1727203906.79516: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203906.4833648-19755-126367977159654/ > /dev/null 2>&1 && sleep 0' 19285 1727203906.80210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203906.80225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203906.80235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.80282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.80306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.80374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.82309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.82313: stdout chunk (state=3): >>><<< 19285 1727203906.82318: stderr chunk (state=3): >>><<< 19285 1727203906.82335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.82483: handler run complete 19285 1727203906.82487: attempt loop complete, returning result 19285 1727203906.82489: _execute() done 19285 1727203906.82490: dumping result to json 19285 1727203906.82492: done dumping result, returning 19285 1727203906.82494: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000133] 19285 1727203906.82500: sending task result for task 028d2410-947f-f31b-fb3f-000000000133 19285 1727203906.82573: done sending task result for task 028d2410-947f-f31b-fb3f-000000000133 19285 1727203906.82578: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 19285 1727203906.82652: no more pending results, returning what we have 19285 1727203906.82656: results queue empty 19285 1727203906.82657: checking for any_errors_fatal 19285 1727203906.82663: done checking for any_errors_fatal 19285 1727203906.82663: checking for max_fail_percentage 19285 1727203906.82665: done checking for max_fail_percentage 19285 1727203906.82666: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.82667: done checking to see if all hosts have failed 19285 1727203906.82667: getting the remaining hosts for this loop 19285 1727203906.82669: done getting the remaining hosts for this loop 19285 1727203906.82673: getting the next task for host managed-node2 19285 1727203906.82762: done getting next task for host managed-node2 19285 1727203906.82765: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 19285 1727203906.82768: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203906.82772: getting variables 19285 1727203906.82774: in VariableManager get_vars() 19285 1727203906.82926: Calling all_inventory to load vars for managed-node2 19285 1727203906.82929: Calling groups_inventory to load vars for managed-node2 19285 1727203906.82938: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.82949: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.82951: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.82954: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.83507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.83707: done with get_vars() 19285 1727203906.83717: done getting variables 19285 1727203906.83814: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 19285 1727203906.83928: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.410) 0:00:05.914 ***** 19285 1727203906.83956: entering _queue_task() for managed-node2/assert 19285 1727203906.83958: Creating lock for 
assert 19285 1727203906.84241: worker is 1 (out of 1 available) 19285 1727203906.84255: exiting _queue_task() for managed-node2/assert 19285 1727203906.84267: done queuing things up, now waiting for results queue to drain 19285 1727203906.84269: waiting for pending results... 19285 1727203906.84551: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 19285 1727203906.84615: in run() - task 028d2410-947f-f31b-fb3f-00000000011a 19285 1727203906.84619: variable 'ansible_search_path' from source: unknown 19285 1727203906.84622: variable 'ansible_search_path' from source: unknown 19285 1727203906.84662: calling self._execute() 19285 1727203906.84723: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.84726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.84735: variable 'omit' from source: magic vars 19285 1727203906.85015: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.85131: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.85135: variable 'omit' from source: magic vars 19285 1727203906.85138: variable 'omit' from source: magic vars 19285 1727203906.85141: variable 'interface' from source: set_fact 19285 1727203906.85143: variable 'omit' from source: magic vars 19285 1727203906.85175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203906.85204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203906.85220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203906.85234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.85244: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.85272: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203906.85277: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.85280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.85344: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203906.85347: Set connection var ansible_pipelining to False 19285 1727203906.85354: Set connection var ansible_timeout to 10 19285 1727203906.85356: Set connection var ansible_shell_type to sh 19285 1727203906.85369: Set connection var ansible_shell_executable to /bin/sh 19285 1727203906.85371: Set connection var ansible_connection to ssh 19285 1727203906.85388: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.85391: variable 'ansible_connection' from source: unknown 19285 1727203906.85393: variable 'ansible_module_compression' from source: unknown 19285 1727203906.85396: variable 'ansible_shell_type' from source: unknown 19285 1727203906.85398: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.85401: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.85403: variable 'ansible_pipelining' from source: unknown 19285 1727203906.85405: variable 'ansible_timeout' from source: unknown 19285 1727203906.85410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.85513: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203906.85520: variable 'omit' from source: magic vars 19285 1727203906.85526: starting 
attempt loop 19285 1727203906.85529: running the handler 19285 1727203906.85630: variable 'interface_stat' from source: set_fact 19285 1727203906.85638: Evaluated conditional (not interface_stat.stat.exists): True 19285 1727203906.85643: handler run complete 19285 1727203906.85654: attempt loop complete, returning result 19285 1727203906.85656: _execute() done 19285 1727203906.85659: dumping result to json 19285 1727203906.85663: done dumping result, returning 19285 1727203906.85670: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-00000000011a] 19285 1727203906.85677: sending task result for task 028d2410-947f-f31b-fb3f-00000000011a 19285 1727203906.85784: done sending task result for task 028d2410-947f-f31b-fb3f-00000000011a 19285 1727203906.85788: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 19285 1727203906.85833: no more pending results, returning what we have 19285 1727203906.85836: results queue empty 19285 1727203906.85837: checking for any_errors_fatal 19285 1727203906.85845: done checking for any_errors_fatal 19285 1727203906.85846: checking for max_fail_percentage 19285 1727203906.85847: done checking for max_fail_percentage 19285 1727203906.85848: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.85849: done checking to see if all hosts have failed 19285 1727203906.85850: getting the remaining hosts for this loop 19285 1727203906.85852: done getting the remaining hosts for this loop 19285 1727203906.85855: getting the next task for host managed-node2 19285 1727203906.85862: done getting next task for host managed-node2 19285 1727203906.85865: ^ task is: TASK: meta (flush_handlers) 19285 1727203906.85866: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203906.85870: getting variables 19285 1727203906.85872: in VariableManager get_vars() 19285 1727203906.85897: Calling all_inventory to load vars for managed-node2 19285 1727203906.85899: Calling groups_inventory to load vars for managed-node2 19285 1727203906.85902: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.85910: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.85913: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.85915: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.86039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.86154: done with get_vars() 19285 1727203906.86161: done getting variables 19285 1727203906.86210: in VariableManager get_vars() 19285 1727203906.86216: Calling all_inventory to load vars for managed-node2 19285 1727203906.86217: Calling groups_inventory to load vars for managed-node2 19285 1727203906.86219: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.86222: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.86223: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.86224: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.86334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.86444: done with get_vars() 19285 1727203906.86452: done queuing things up, now waiting for results queue to drain 19285 1727203906.86454: results queue empty 19285 1727203906.86454: checking for any_errors_fatal 19285 1727203906.86456: done checking for any_errors_fatal 19285 1727203906.86456: checking for max_fail_percentage 19285 1727203906.86457: done checking for 
max_fail_percentage 19285 1727203906.86457: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.86458: done checking to see if all hosts have failed 19285 1727203906.86462: getting the remaining hosts for this loop 19285 1727203906.86463: done getting the remaining hosts for this loop 19285 1727203906.86465: getting the next task for host managed-node2 19285 1727203906.86467: done getting next task for host managed-node2 19285 1727203906.86468: ^ task is: TASK: meta (flush_handlers) 19285 1727203906.86469: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203906.86471: getting variables 19285 1727203906.86471: in VariableManager get_vars() 19285 1727203906.86478: Calling all_inventory to load vars for managed-node2 19285 1727203906.86479: Calling groups_inventory to load vars for managed-node2 19285 1727203906.86481: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.86484: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.86485: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.86487: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.86599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.86766: done with get_vars() 19285 1727203906.86773: done getting variables 19285 1727203906.86818: in VariableManager get_vars() 19285 1727203906.86826: Calling all_inventory to load vars for managed-node2 19285 1727203906.86828: Calling groups_inventory to load vars for managed-node2 19285 1727203906.86831: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.86835: Calling 
all_plugins_play to load vars for managed-node2 19285 1727203906.86837: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.86840: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.86989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.87186: done with get_vars() 19285 1727203906.87196: done queuing things up, now waiting for results queue to drain 19285 1727203906.87198: results queue empty 19285 1727203906.87199: checking for any_errors_fatal 19285 1727203906.87200: done checking for any_errors_fatal 19285 1727203906.87201: checking for max_fail_percentage 19285 1727203906.87202: done checking for max_fail_percentage 19285 1727203906.87202: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.87203: done checking to see if all hosts have failed 19285 1727203906.87204: getting the remaining hosts for this loop 19285 1727203906.87204: done getting the remaining hosts for this loop 19285 1727203906.87207: getting the next task for host managed-node2 19285 1727203906.87209: done getting next task for host managed-node2 19285 1727203906.87210: ^ task is: None 19285 1727203906.87211: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.87212: done queuing things up, now waiting for results queue to drain 19285 1727203906.87213: results queue empty 19285 1727203906.87214: checking for any_errors_fatal 19285 1727203906.87215: done checking for any_errors_fatal 19285 1727203906.87215: checking for max_fail_percentage 19285 1727203906.87216: done checking for max_fail_percentage 19285 1727203906.87217: checking to see if all hosts have failed and the running result is not ok 19285 1727203906.87217: done checking to see if all hosts have failed 19285 1727203906.87219: getting the next task for host managed-node2 19285 1727203906.87221: done getting next task for host managed-node2 19285 1727203906.87222: ^ task is: None 19285 1727203906.87223: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.87262: in VariableManager get_vars() 19285 1727203906.87284: done with get_vars() 19285 1727203906.87290: in VariableManager get_vars() 19285 1727203906.87317: done with get_vars() 19285 1727203906.87323: variable 'omit' from source: magic vars 19285 1727203906.87353: in VariableManager get_vars() 19285 1727203906.87366: done with get_vars() 19285 1727203906.87387: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 19285 1727203906.88243: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203906.88277: getting the remaining hosts for this loop 19285 1727203906.88278: done getting the remaining hosts for this loop 19285 1727203906.88281: getting the next task for host managed-node2 19285 1727203906.88283: done getting next task for host managed-node2 19285 1727203906.88285: ^ task is: TASK: Gathering Facts 19285 1727203906.88287: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203906.88289: getting variables 19285 1727203906.88290: in VariableManager get_vars() 19285 1727203906.88301: Calling all_inventory to load vars for managed-node2 19285 1727203906.88321: Calling groups_inventory to load vars for managed-node2 19285 1727203906.88324: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203906.88330: Calling all_plugins_play to load vars for managed-node2 19285 1727203906.88337: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203906.88341: Calling groups_plugins_play to load vars for managed-node2 19285 1727203906.88495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203906.88736: done with get_vars() 19285 1727203906.88753: done getting variables 19285 1727203906.88805: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Tuesday 24 September 2024 14:51:46 -0400 (0:00:00.048) 0:00:05.962 ***** 19285 1727203906.88829: entering _queue_task() for managed-node2/gather_facts 19285 1727203906.89172: worker is 1 (out of 1 available) 19285 1727203906.89298: exiting _queue_task() for managed-node2/gather_facts 19285 1727203906.89309: done queuing things up, now waiting for results queue to drain 19285 1727203906.89310: waiting for pending results... 
19285 1727203906.89523: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203906.89600: in run() - task 028d2410-947f-f31b-fb3f-00000000014c 19285 1727203906.89664: variable 'ansible_search_path' from source: unknown 19285 1727203906.89763: calling self._execute() 19285 1727203906.89872: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.89948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.89951: variable 'omit' from source: magic vars 19285 1727203906.90365: variable 'ansible_distribution_major_version' from source: facts 19285 1727203906.90391: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203906.90403: variable 'omit' from source: magic vars 19285 1727203906.90501: variable 'omit' from source: magic vars 19285 1727203906.90583: variable 'omit' from source: magic vars 19285 1727203906.90780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203906.90882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203906.90886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203906.90889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.90892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203906.90895: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203906.90898: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.90901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.90984: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203906.91005: Set connection var ansible_pipelining to False 19285 1727203906.91018: Set connection var ansible_timeout to 10 19285 1727203906.91028: Set connection var ansible_shell_type to sh 19285 1727203906.91250: Set connection var ansible_shell_executable to /bin/sh 19285 1727203906.91256: Set connection var ansible_connection to ssh 19285 1727203906.91259: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.91261: variable 'ansible_connection' from source: unknown 19285 1727203906.91264: variable 'ansible_module_compression' from source: unknown 19285 1727203906.91266: variable 'ansible_shell_type' from source: unknown 19285 1727203906.91268: variable 'ansible_shell_executable' from source: unknown 19285 1727203906.91270: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203906.91272: variable 'ansible_pipelining' from source: unknown 19285 1727203906.91274: variable 'ansible_timeout' from source: unknown 19285 1727203906.91278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203906.91306: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203906.91354: variable 'omit' from source: magic vars 19285 1727203906.91358: starting attempt loop 19285 1727203906.91364: running the handler 19285 1727203906.91367: variable 'ansible_facts' from source: unknown 19285 1727203906.91423: _low_level_execute_command(): starting 19285 1727203906.91428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203906.92533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203906.92552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.92606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.92710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.94419: stdout chunk (state=3): >>>/root <<< 19285 1727203906.94649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.94653: stdout chunk (state=3): >>><<< 19285 1727203906.94657: stderr chunk (state=3): >>><<< 19285 1727203906.94685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.94706: _low_level_execute_command(): starting 19285 1727203906.94801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288 `" && echo ansible-tmp-1727203906.9469192-19834-73439331523288="` echo /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288 `" ) && sleep 0' 19285 1727203906.95393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203906.95416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203906.95431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203906.95453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203906.95549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203906.95563: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.95627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203906.95792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.96016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203906.97954: stdout chunk (state=3): >>>ansible-tmp-1727203906.9469192-19834-73439331523288=/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288 <<< 19285 1727203906.98087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203906.98116: stderr chunk (state=3): >>><<< 19285 1727203906.98130: stdout chunk (state=3): >>><<< 19285 1727203906.98146: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203906.9469192-19834-73439331523288=/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203906.98178: variable 'ansible_module_compression' from source: unknown 19285 1727203906.98224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203906.98277: variable 'ansible_facts' from source: unknown 19285 1727203906.98405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py 19285 1727203906.98515: Sending initial data 19285 1727203906.98518: Sent initial data (153 bytes) 19285 1727203906.99026: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.99029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203906.99109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203906.99168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203907.00738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19285 1727203907.00743: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203907.00804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203907.00878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpcm5j2fqu /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py <<< 19285 1727203907.00884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py" <<< 19285 1727203907.00949: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpcm5j2fqu" to remote "/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py" <<< 19285 1727203907.00953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py" <<< 19285 1727203907.02128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203907.02171: stderr chunk (state=3): >>><<< 19285 1727203907.02174: stdout chunk (state=3): >>><<< 19285 1727203907.02192: done transferring module to remote 19285 1727203907.02201: _low_level_execute_command(): starting 19285 1727203907.02205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/ /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py && sleep 0' 19285 1727203907.02639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203907.02643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203907.02645: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203907.02647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203907.02650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203907.02698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203907.02707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203907.02777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203907.04671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203907.04679: stdout chunk (state=3): >>><<< 19285 1727203907.04681: stderr chunk (state=3): >>><<< 19285 1727203907.04773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203907.04778: _low_level_execute_command(): starting 19285 1727203907.04781: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/AnsiballZ_setup.py && sleep 0' 19285 1727203907.05421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203907.05424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203907.05494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203907.05510: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 19285 1727203907.05515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203907.05590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203907.67745: stdout chunk (state=3): >>> <<< 19285 1727203907.67793: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.46923828125, "5m": 
0.3798828125, "15m": 0.19140625}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", 
"ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2912, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 619, "free": 2912}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 493, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788237824, "block_size": 4096, "block_total": 65519099, "block_available": 63913144, "block_used": 1605955, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "47", "epoch": "1727203907", "epoch_int": "1727203907", "date": "2024-09-24", "time": "14:51:47", "iso8601_micro": "2024-09-24T18:51:47.638607Z", "iso8601": "2024-09-24T18:51:47Z", "iso8601_basic": "20240924T145147638607", "iso8601_basic_short": "20240924T145147", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", 
"scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", 
"ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203907.69865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203907.69869: stdout chunk (state=3): >>><<< 19285 1727203907.69872: stderr chunk (state=3): >>><<< 19285 1727203907.70219: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.46923828125, "5m": 0.3798828125, "15m": 0.19140625}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2912, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 619, "free": 2912}, "nocache": {"free": 3268, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": 
[]}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 493, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788237824, "block_size": 4096, "block_total": 65519099, "block_available": 63913144, "block_used": 1605955, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "47", "epoch": "1727203907", "epoch_int": "1727203907", "date": "2024-09-24", "time": "14:51:47", "iso8601_micro": "2024-09-24T18:51:47.638607Z", "iso8601": "2024-09-24T18:51:47Z", "iso8601_basic": "20240924T145147638607", "iso8601_basic_short": "20240924T145147", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", 
"rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], 
"ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203907.70671: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203907.70972: _low_level_execute_command(): starting 19285 1727203907.70977: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203906.9469192-19834-73439331523288/ > /dev/null 2>&1 && sleep 0' 19285 1727203907.71740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203907.71792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203907.71861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203907.71880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203907.71902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203907.72048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203907.73966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203907.73990: stdout chunk (state=3): >>><<< 19285 1727203907.74005: stderr chunk (state=3): >>><<< 19285 1727203907.74039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203907.74074: handler run complete 19285 1727203907.74232: variable 'ansible_facts' from source: unknown 19285 1727203907.74539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.75328: variable 'ansible_facts' from source: unknown 19285 1727203907.75393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.75700: attempt loop complete, returning result 19285 1727203907.75717: _execute() done 19285 1727203907.75737: dumping result to json 19285 1727203907.75794: done dumping result, returning 19285 1727203907.75878: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-00000000014c] 19285 1727203907.75972: sending task result for task 028d2410-947f-f31b-fb3f-00000000014c ok: [managed-node2] 19285 1727203907.76940: no more pending results, returning what we have 19285 1727203907.76942: results queue empty 19285 1727203907.76943: checking for any_errors_fatal 19285 1727203907.76944: done checking for any_errors_fatal 19285 1727203907.76945: checking for max_fail_percentage 19285 1727203907.76946: done checking for max_fail_percentage 19285 1727203907.76947: checking to see if all hosts have failed and the running result is not ok 19285 1727203907.76948: done checking to see if all hosts have failed 19285 1727203907.76948: getting the remaining hosts for this loop 19285 1727203907.76949: done getting the remaining hosts for this loop 19285 1727203907.76952: getting the next task for host managed-node2 19285 1727203907.76957: done getting next task for host managed-node2 19285 1727203907.76961: ^ task is: TASK: meta (flush_handlers) 19285 1727203907.76962: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203907.76965: getting variables 19285 1727203907.76966: in VariableManager get_vars() 19285 1727203907.77036: Calling all_inventory to load vars for managed-node2 19285 1727203907.77039: Calling groups_inventory to load vars for managed-node2 19285 1727203907.77041: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.77047: done sending task result for task 028d2410-947f-f31b-fb3f-00000000014c 19285 1727203907.77049: WORKER PROCESS EXITING 19285 1727203907.77060: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.77062: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.77065: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.77270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.77479: done with get_vars() 19285 1727203907.77489: done getting variables 19285 1727203907.77561: in VariableManager get_vars() 19285 1727203907.77579: Calling all_inventory to load vars for managed-node2 19285 1727203907.77582: Calling groups_inventory to load vars for managed-node2 19285 1727203907.77584: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.77588: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.77590: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.77592: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.77749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.77940: done with get_vars() 19285 1727203907.77952: done queuing things up, now waiting for results queue to drain 19285 1727203907.77954: results queue empty 19285 
1727203907.77955: checking for any_errors_fatal 19285 1727203907.77957: done checking for any_errors_fatal 19285 1727203907.77961: checking for max_fail_percentage 19285 1727203907.77962: done checking for max_fail_percentage 19285 1727203907.77963: checking to see if all hosts have failed and the running result is not ok 19285 1727203907.77968: done checking to see if all hosts have failed 19285 1727203907.77969: getting the remaining hosts for this loop 19285 1727203907.77970: done getting the remaining hosts for this loop 19285 1727203907.77972: getting the next task for host managed-node2 19285 1727203907.77981: done getting next task for host managed-node2 19285 1727203907.77984: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203907.77985: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203907.77994: getting variables 19285 1727203907.77995: in VariableManager get_vars() 19285 1727203907.78012: Calling all_inventory to load vars for managed-node2 19285 1727203907.78014: Calling groups_inventory to load vars for managed-node2 19285 1727203907.78016: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.78020: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.78022: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.78024: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.78155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.78346: done with get_vars() 19285 1727203907.78354: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.895) 0:00:06.858 ***** 19285 1727203907.78425: entering _queue_task() for managed-node2/include_tasks 19285 1727203907.78719: worker is 1 (out of 1 available) 19285 1727203907.78730: exiting _queue_task() for managed-node2/include_tasks 19285 1727203907.78741: done queuing things up, now waiting for results queue to drain 19285 1727203907.78742: waiting for pending results... 
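The setup payload echoed above (once as a raw stdout chunk, once again in the `_low_level_execute_command() done: rc=0, stdout=` summary) is plain JSON with everything the module gathered under a top-level `ansible_facts` key. A minimal Python sketch of picking out a few of the values visible in this run; the dict literal below is a hand-trimmed subset of the real payload, not the module's full output:

```python
import json

# Hand-trimmed subset of the setup module's stdout from the log above
# (the real payload carries hundreds of keys under "ansible_facts").
payload = json.loads("""
{"ansible_facts": {"ansible_distribution": "CentOS",
                   "ansible_distribution_major_version": "10",
                   "ansible_default_ipv4": {"interface": "eth0",
                                            "address": "10.31.13.254",
                                            "mtu": 9001},
                   "ansible_memtotal_mb": 3531,
                   "module_setup": true},
 "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10}}}
""")

facts = payload["ansible_facts"]
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
# -> CentOS 10
print(facts["ansible_default_ipv4"]["address"])
# -> 10.31.13.254
```

The same shape holds for every fact in the log: nested dicts keyed by fact name, with the `invocation` sibling recording the module arguments that produced them.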
19285 1727203907.79009: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203907.79110: in run() - task 028d2410-947f-f31b-fb3f-000000000014 19285 1727203907.79129: variable 'ansible_search_path' from source: unknown 19285 1727203907.79135: variable 'ansible_search_path' from source: unknown 19285 1727203907.79173: calling self._execute() 19285 1727203907.79338: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203907.79372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203907.79588: variable 'omit' from source: magic vars 19285 1727203907.79910: variable 'ansible_distribution_major_version' from source: facts 19285 1727203907.79930: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203907.79940: _execute() done 19285 1727203907.79946: dumping result to json 19285 1727203907.79952: done dumping result, returning 19285 1727203907.79966: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-f31b-fb3f-000000000014] 19285 1727203907.79978: sending task result for task 028d2410-947f-f31b-fb3f-000000000014 19285 1727203907.80183: done sending task result for task 028d2410-947f-f31b-fb3f-000000000014 19285 1727203907.80187: WORKER PROCESS EXITING 19285 1727203907.80225: no more pending results, returning what we have 19285 1727203907.80230: in VariableManager get_vars() 19285 1727203907.80379: Calling all_inventory to load vars for managed-node2 19285 1727203907.80383: Calling groups_inventory to load vars for managed-node2 19285 1727203907.80386: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.80395: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.80398: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.80400: Calling 
groups_plugins_play to load vars for managed-node2 19285 1727203907.80665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.80942: done with get_vars() 19285 1727203907.80951: variable 'ansible_search_path' from source: unknown 19285 1727203907.80952: variable 'ansible_search_path' from source: unknown 19285 1727203907.80985: we have included files to process 19285 1727203907.80987: generating all_blocks data 19285 1727203907.80988: done generating all_blocks data 19285 1727203907.80989: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203907.80990: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203907.80993: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203907.82334: done processing included file 19285 1727203907.82337: iterating over new_blocks loaded from include file 19285 1727203907.82338: in VariableManager get_vars() 19285 1727203907.82492: done with get_vars() 19285 1727203907.82494: filtering new block on tags 19285 1727203907.82510: done filtering new block on tags 19285 1727203907.82513: in VariableManager get_vars() 19285 1727203907.82533: done with get_vars() 19285 1727203907.82535: filtering new block on tags 19285 1727203907.82554: done filtering new block on tags 19285 1727203907.82556: in VariableManager get_vars() 19285 1727203907.82593: done with get_vars() 19285 1727203907.82595: filtering new block on tags 19285 1727203907.82612: done filtering new block on tags 19285 1727203907.82615: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 19285 1727203907.82620: extending task lists for 
all hosts with included blocks 19285 1727203907.83065: done extending task lists 19285 1727203907.83067: done processing included files 19285 1727203907.83068: results queue empty 19285 1727203907.83068: checking for any_errors_fatal 19285 1727203907.83070: done checking for any_errors_fatal 19285 1727203907.83070: checking for max_fail_percentage 19285 1727203907.83071: done checking for max_fail_percentage 19285 1727203907.83072: checking to see if all hosts have failed and the running result is not ok 19285 1727203907.83073: done checking to see if all hosts have failed 19285 1727203907.83074: getting the remaining hosts for this loop 19285 1727203907.83076: done getting the remaining hosts for this loop 19285 1727203907.83079: getting the next task for host managed-node2 19285 1727203907.83082: done getting next task for host managed-node2 19285 1727203907.83085: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203907.83087: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203907.83096: getting variables 19285 1727203907.83097: in VariableManager get_vars() 19285 1727203907.83109: Calling all_inventory to load vars for managed-node2 19285 1727203907.83111: Calling groups_inventory to load vars for managed-node2 19285 1727203907.83113: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.83122: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.83125: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.83128: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.83293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.83469: done with get_vars() 19285 1727203907.83480: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.051) 0:00:06.909 ***** 19285 1727203907.83546: entering _queue_task() for managed-node2/setup 19285 1727203907.83897: worker is 1 (out of 1 available) 19285 1727203907.83911: exiting _queue_task() for managed-node2/setup 19285 1727203907.83924: done queuing things up, now waiting for results queue to drain 19285 1727203907.83925: waiting for pending results... 
19285 1727203907.84230: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203907.84286: in run() - task 028d2410-947f-f31b-fb3f-00000000018d 19285 1727203907.84307: variable 'ansible_search_path' from source: unknown 19285 1727203907.84314: variable 'ansible_search_path' from source: unknown 19285 1727203907.84358: calling self._execute() 19285 1727203907.84445: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203907.84455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203907.84472: variable 'omit' from source: magic vars 19285 1727203907.84826: variable 'ansible_distribution_major_version' from source: facts 19285 1727203907.84870: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203907.85052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203907.88124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203907.88204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203907.88254: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203907.88369: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203907.88372: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203907.88453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203907.88498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203907.88534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203907.88582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203907.88615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203907.88675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203907.88741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203907.88744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203907.88881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203907.88884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203907.89001: variable '__network_required_facts' from source: role 
'' defaults 19285 1727203907.89017: variable 'ansible_facts' from source: unknown 19285 1727203907.89130: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19285 1727203907.89152: when evaluation is False, skipping this task 19285 1727203907.89164: _execute() done 19285 1727203907.89172: dumping result to json 19285 1727203907.89183: done dumping result, returning 19285 1727203907.89196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-f31b-fb3f-00000000018d] 19285 1727203907.89213: sending task result for task 028d2410-947f-f31b-fb3f-00000000018d skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203907.89374: no more pending results, returning what we have 19285 1727203907.89481: results queue empty 19285 1727203907.89482: checking for any_errors_fatal 19285 1727203907.89485: done checking for any_errors_fatal 19285 1727203907.89486: checking for max_fail_percentage 19285 1727203907.89488: done checking for max_fail_percentage 19285 1727203907.89489: checking to see if all hosts have failed and the running result is not ok 19285 1727203907.89490: done checking to see if all hosts have failed 19285 1727203907.89490: getting the remaining hosts for this loop 19285 1727203907.89492: done getting the remaining hosts for this loop 19285 1727203907.89495: getting the next task for host managed-node2 19285 1727203907.89504: done getting next task for host managed-node2 19285 1727203907.89507: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203907.89510: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203907.89525: getting variables 19285 1727203907.89526: in VariableManager get_vars() 19285 1727203907.89564: Calling all_inventory to load vars for managed-node2 19285 1727203907.89567: Calling groups_inventory to load vars for managed-node2 19285 1727203907.89570: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.89583: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.89585: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.89589: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.89780: done sending task result for task 028d2410-947f-f31b-fb3f-00000000018d 19285 1727203907.89783: WORKER PROCESS EXITING 19285 1727203907.89808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.90056: done with get_vars() 19285 1727203907.90070: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.066) 0:00:06.976 ***** 19285 1727203907.90242: entering _queue_task() for managed-node2/stat 19285 1727203907.91105: worker is 1 (out of 1 available) 19285 1727203907.91118: exiting _queue_task() for managed-node2/stat 19285 1727203907.91131: done queuing things up, now waiting for results queue to drain 19285 1727203907.91132: waiting for pending results... 
19285 1727203907.91866: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203907.91977: in run() - task 028d2410-947f-f31b-fb3f-00000000018f 19285 1727203907.92004: variable 'ansible_search_path' from source: unknown 19285 1727203907.92109: variable 'ansible_search_path' from source: unknown 19285 1727203907.92113: calling self._execute() 19285 1727203907.92179: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203907.92190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203907.92205: variable 'omit' from source: magic vars 19285 1727203907.93233: variable 'ansible_distribution_major_version' from source: facts 19285 1727203907.93256: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203907.93748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203907.94779: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203907.94935: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203907.94939: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203907.95071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203907.95309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203907.95396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203907.95589: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203907.95592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203907.95682: variable '__network_is_ostree' from source: set_fact 19285 1727203907.95697: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203907.95710: when evaluation is False, skipping this task 19285 1727203907.95718: _execute() done 19285 1727203907.95726: dumping result to json 19285 1727203907.95734: done dumping result, returning 19285 1727203907.95746: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-f31b-fb3f-00000000018f] 19285 1727203907.95756: sending task result for task 028d2410-947f-f31b-fb3f-00000000018f 19285 1727203907.95963: done sending task result for task 028d2410-947f-f31b-fb3f-00000000018f 19285 1727203907.95967: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203907.96048: no more pending results, returning what we have 19285 1727203907.96052: results queue empty 19285 1727203907.96057: checking for any_errors_fatal 19285 1727203907.96070: done checking for any_errors_fatal 19285 1727203907.96071: checking for max_fail_percentage 19285 1727203907.96073: done checking for max_fail_percentage 19285 1727203907.96074: checking to see if all hosts have failed and the running result is not ok 19285 1727203907.96077: done checking to see if all hosts have failed 19285 1727203907.96078: getting the remaining hosts for this loop 19285 1727203907.96080: done getting the remaining hosts for this loop 19285 
1727203907.96086: getting the next task for host managed-node2 19285 1727203907.96098: done getting next task for host managed-node2 19285 1727203907.96105: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203907.96109: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203907.96122: getting variables 19285 1727203907.96123: in VariableManager get_vars() 19285 1727203907.96171: Calling all_inventory to load vars for managed-node2 19285 1727203907.96412: Calling groups_inventory to load vars for managed-node2 19285 1727203907.96416: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203907.96427: Calling all_plugins_play to load vars for managed-node2 19285 1727203907.96429: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203907.96433: Calling groups_plugins_play to load vars for managed-node2 19285 1727203907.96930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203907.97209: done with get_vars() 19285 1727203907.97219: done getting variables 19285 1727203907.97881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.076) 0:00:07.053 ***** 19285 1727203907.97924: entering _queue_task() for managed-node2/set_fact 19285 1727203907.98940: worker is 1 (out of 1 available) 19285 1727203907.98949: exiting _queue_task() for managed-node2/set_fact 19285 1727203907.98963: done queuing things up, now waiting for results queue to drain 19285 1727203907.98965: waiting for pending results... 19285 1727203907.99194: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203907.99369: in run() - task 028d2410-947f-f31b-fb3f-000000000190 19285 1727203907.99606: variable 'ansible_search_path' from source: unknown 19285 1727203907.99610: variable 'ansible_search_path' from source: unknown 19285 1727203907.99884: calling self._execute() 19285 1727203907.99922: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203907.99968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203907.99991: variable 'omit' from source: magic vars 19285 1727203908.00892: variable 'ansible_distribution_major_version' from source: facts 19285 1727203908.00910: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203908.01700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203908.03003: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203908.03009: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203908.03112: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 
1727203908.03155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203908.03417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203908.03655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203908.03663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203908.03666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203908.04093: variable '__network_is_ostree' from source: set_fact 19285 1727203908.04097: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203908.04100: when evaluation is False, skipping this task 19285 1727203908.04102: _execute() done 19285 1727203908.04104: dumping result to json 19285 1727203908.04106: done dumping result, returning 19285 1727203908.04108: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-f31b-fb3f-000000000190] 19285 1727203908.04110: sending task result for task 028d2410-947f-f31b-fb3f-000000000190 19285 1727203908.04186: done sending task result for task 028d2410-947f-f31b-fb3f-000000000190 19285 1727203908.04190: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203908.04265: no more pending results, returning what we 
have 19285 1727203908.04269: results queue empty 19285 1727203908.04270: checking for any_errors_fatal 19285 1727203908.04280: done checking for any_errors_fatal 19285 1727203908.04281: checking for max_fail_percentage 19285 1727203908.04283: done checking for max_fail_percentage 19285 1727203908.04283: checking to see if all hosts have failed and the running result is not ok 19285 1727203908.04285: done checking to see if all hosts have failed 19285 1727203908.04285: getting the remaining hosts for this loop 19285 1727203908.04287: done getting the remaining hosts for this loop 19285 1727203908.04290: getting the next task for host managed-node2 19285 1727203908.04305: done getting next task for host managed-node2 19285 1727203908.04309: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203908.04312: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203908.04323: getting variables 19285 1727203908.04325: in VariableManager get_vars() 19285 1727203908.04364: Calling all_inventory to load vars for managed-node2 19285 1727203908.04366: Calling groups_inventory to load vars for managed-node2 19285 1727203908.04369: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203908.04812: Calling all_plugins_play to load vars for managed-node2 19285 1727203908.04816: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203908.04900: Calling groups_plugins_play to load vars for managed-node2 19285 1727203908.05638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203908.06482: done with get_vars() 19285 1727203908.06527: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.089) 0:00:07.142 ***** 19285 1727203908.06843: entering _queue_task() for managed-node2/service_facts 19285 1727203908.06845: Creating lock for service_facts 19285 1727203908.07790: worker is 1 (out of 1 available) 19285 1727203908.07804: exiting _queue_task() for managed-node2/service_facts 19285 1727203908.07818: done queuing things up, now waiting for results queue to drain 19285 1727203908.07820: waiting for pending results... 
19285 1727203908.08523: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203908.09160: in run() - task 028d2410-947f-f31b-fb3f-000000000192 19285 1727203908.09165: variable 'ansible_search_path' from source: unknown 19285 1727203908.09167: variable 'ansible_search_path' from source: unknown 19285 1727203908.09381: calling self._execute() 19285 1727203908.09475: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203908.09670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203908.09890: variable 'omit' from source: magic vars 19285 1727203908.10651: variable 'ansible_distribution_major_version' from source: facts 19285 1727203908.10981: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203908.10984: variable 'omit' from source: magic vars 19285 1727203908.10987: variable 'omit' from source: magic vars 19285 1727203908.11094: variable 'omit' from source: magic vars 19285 1727203908.11144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203908.11229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203908.11328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203908.11352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203908.11373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203908.11445: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203908.11529: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203908.11537: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 19285 1727203908.11736: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203908.12073: Set connection var ansible_pipelining to False 19285 1727203908.12078: Set connection var ansible_timeout to 10 19285 1727203908.12081: Set connection var ansible_shell_type to sh 19285 1727203908.12083: Set connection var ansible_shell_executable to /bin/sh 19285 1727203908.12085: Set connection var ansible_connection to ssh 19285 1727203908.12087: variable 'ansible_shell_executable' from source: unknown 19285 1727203908.12089: variable 'ansible_connection' from source: unknown 19285 1727203908.12091: variable 'ansible_module_compression' from source: unknown 19285 1727203908.12093: variable 'ansible_shell_type' from source: unknown 19285 1727203908.12095: variable 'ansible_shell_executable' from source: unknown 19285 1727203908.12097: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203908.12099: variable 'ansible_pipelining' from source: unknown 19285 1727203908.12101: variable 'ansible_timeout' from source: unknown 19285 1727203908.12102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203908.12701: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203908.12719: variable 'omit' from source: magic vars 19285 1727203908.12729: starting attempt loop 19285 1727203908.12743: running the handler 19285 1727203908.12766: _low_level_execute_command(): starting 19285 1727203908.12780: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203908.14422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203908.14904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203908.15017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203908.15189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203908.16816: stdout chunk (state=3): >>>/root <<< 19285 1727203908.16967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203908.17001: stderr chunk (state=3): >>><<< 19285 1727203908.17031: stdout chunk (state=3): >>><<< 19285 1727203908.17153: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203908.17172: _low_level_execute_command(): starting 19285 1727203908.17203: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615 `" && echo ansible-tmp-1727203908.1715953-20012-80199376062615="` echo /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615 `" ) && sleep 0' 19285 1727203908.18089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203908.18126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203908.18149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203908.18238: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203908.18267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203908.18306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203908.18394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203908.20357: stdout chunk (state=3): >>>ansible-tmp-1727203908.1715953-20012-80199376062615=/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615 <<< 19285 1727203908.20498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203908.20508: stdout chunk (state=3): >>><<< 19285 1727203908.20519: stderr chunk (state=3): >>><<< 19285 1727203908.20550: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203908.1715953-20012-80199376062615=/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203908.20606: variable 'ansible_module_compression' from source: unknown 19285 1727203908.20664: ANSIBALLZ: Using lock for service_facts 19285 1727203908.20744: ANSIBALLZ: Acquiring lock 19285 1727203908.20747: ANSIBALLZ: Lock acquired: 140487235943408 19285 1727203908.20749: ANSIBALLZ: Creating module 19285 1727203908.52235: ANSIBALLZ: Writing module into payload 19285 1727203908.52384: ANSIBALLZ: Writing module 19285 1727203908.52387: ANSIBALLZ: Renaming module 19285 1727203908.52390: ANSIBALLZ: Done creating module 19285 1727203908.52398: variable 'ansible_facts' from source: unknown 19285 1727203908.52555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py 19285 1727203908.52770: Sending initial data 19285 1727203908.52774: Sent initial data (161 bytes) 19285 1727203908.53704: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203908.53752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203908.53767: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203908.53791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203908.53980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203908.54049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203908.54202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203908.55988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203908.56436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203908.56527: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpdnwi4pi5 /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py <<< 19285 1727203908.56531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py" <<< 19285 1727203908.56790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpdnwi4pi5" to remote "/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py" <<< 19285 1727203908.58413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203908.58416: stdout chunk (state=3): >>><<< 19285 1727203908.58429: stderr chunk (state=3): >>><<< 19285 1727203908.58601: done transferring module to remote 19285 1727203908.58605: _low_level_execute_command(): starting 19285 1727203908.58607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/ /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py && sleep 0' 19285 1727203908.59140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203908.59149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203908.59159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203908.59178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203908.59190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 <<< 19285 1727203908.59307: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203908.59315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203908.59413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203908.61593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203908.61596: stdout chunk (state=3): >>><<< 19285 1727203908.61598: stderr chunk (state=3): >>><<< 19285 1727203908.61601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203908.61603: _low_level_execute_command(): starting 19285 1727203908.61605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/AnsiballZ_service_facts.py && sleep 0' 19285 1727203908.62548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203908.62562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203908.62581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203908.62683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 
1727203908.62705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203908.62824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.19204: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19285 1727203910.20614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203910.20701: stderr chunk (state=3): >>><<< 19285 1727203910.20705: stdout chunk (state=3): >>><<< 19285 1727203910.20710: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.13.254 closed. 19285 1727203910.21496: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203910.21505: _low_level_execute_command(): starting 19285 1727203910.21510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203908.1715953-20012-80199376062615/ > /dev/null 2>&1 && sleep 0' 19285 1727203910.22091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203910.22101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.22112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.22126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.22139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203910.22146: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203910.22156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.22188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203910.22192: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203910.22194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203910.22196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.22267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.22270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.22272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203910.22275: stderr chunk (state=3): >>>debug2: match found <<< 19285 1727203910.22278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.22305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203910.22326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.22339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.22439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.24483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203910.24486: stderr chunk (state=3): >>><<< 19285 1727203910.24489: stdout chunk (state=3): >>><<< 19285 1727203910.24492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203910.24494: handler run complete 19285 1727203910.24720: variable 'ansible_facts' from source: unknown 19285 1727203910.24867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203910.25384: variable 'ansible_facts' from source: unknown 19285 1727203910.25521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203910.25715: attempt loop complete, returning result 19285 1727203910.25724: _execute() done 19285 1727203910.25729: dumping result to json 19285 1727203910.25787: done dumping result, returning 19285 1727203910.25801: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-f31b-fb3f-000000000192] 19285 1727203910.25812: sending task result for task 028d2410-947f-f31b-fb3f-000000000192 19285 1727203910.27153: done sending task result for task 028d2410-947f-f31b-fb3f-000000000192 19285 1727203910.27157: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203910.27257: no more 
pending results, returning what we have 19285 1727203910.27260: results queue empty 19285 1727203910.27260: checking for any_errors_fatal 19285 1727203910.27263: done checking for any_errors_fatal 19285 1727203910.27264: checking for max_fail_percentage 19285 1727203910.27266: done checking for max_fail_percentage 19285 1727203910.27266: checking to see if all hosts have failed and the running result is not ok 19285 1727203910.27267: done checking to see if all hosts have failed 19285 1727203910.27268: getting the remaining hosts for this loop 19285 1727203910.27269: done getting the remaining hosts for this loop 19285 1727203910.27272: getting the next task for host managed-node2 19285 1727203910.27279: done getting next task for host managed-node2 19285 1727203910.27282: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19285 1727203910.27285: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203910.27295: getting variables 19285 1727203910.27296: in VariableManager get_vars() 19285 1727203910.27324: Calling all_inventory to load vars for managed-node2 19285 1727203910.27327: Calling groups_inventory to load vars for managed-node2 19285 1727203910.27329: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203910.27338: Calling all_plugins_play to load vars for managed-node2 19285 1727203910.27341: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203910.27344: Calling groups_plugins_play to load vars for managed-node2 19285 1727203910.27729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203910.28257: done with get_vars() 19285 1727203910.28270: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:51:50 -0400 (0:00:02.215) 0:00:09.358 ***** 19285 1727203910.28396: entering _queue_task() for managed-node2/package_facts 19285 1727203910.28397: Creating lock for package_facts 19285 1727203910.28632: worker is 1 (out of 1 available) 19285 1727203910.28646: exiting _queue_task() for managed-node2/package_facts 19285 1727203910.28662: done queuing things up, now waiting for results queue to drain 19285 1727203910.28664: waiting for pending results... 
19285 1727203910.28816: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 19285 1727203910.28901: in run() - task 028d2410-947f-f31b-fb3f-000000000193 19285 1727203910.28912: variable 'ansible_search_path' from source: unknown 19285 1727203910.28916: variable 'ansible_search_path' from source: unknown 19285 1727203910.28943: calling self._execute() 19285 1727203910.29008: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203910.29012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203910.29019: variable 'omit' from source: magic vars 19285 1727203910.29280: variable 'ansible_distribution_major_version' from source: facts 19285 1727203910.29290: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203910.29296: variable 'omit' from source: magic vars 19285 1727203910.29331: variable 'omit' from source: magic vars 19285 1727203910.29359: variable 'omit' from source: magic vars 19285 1727203910.29394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203910.29422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203910.29442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203910.29456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203910.29467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203910.29491: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203910.29495: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203910.29497: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 19285 1727203910.29570: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203910.29577: Set connection var ansible_pipelining to False 19285 1727203910.29582: Set connection var ansible_timeout to 10 19285 1727203910.29585: Set connection var ansible_shell_type to sh 19285 1727203910.29592: Set connection var ansible_shell_executable to /bin/sh 19285 1727203910.29595: Set connection var ansible_connection to ssh 19285 1727203910.29609: variable 'ansible_shell_executable' from source: unknown 19285 1727203910.29612: variable 'ansible_connection' from source: unknown 19285 1727203910.29615: variable 'ansible_module_compression' from source: unknown 19285 1727203910.29617: variable 'ansible_shell_type' from source: unknown 19285 1727203910.29619: variable 'ansible_shell_executable' from source: unknown 19285 1727203910.29621: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203910.29625: variable 'ansible_pipelining' from source: unknown 19285 1727203910.29628: variable 'ansible_timeout' from source: unknown 19285 1727203910.29632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203910.29774: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203910.29784: variable 'omit' from source: magic vars 19285 1727203910.29787: starting attempt loop 19285 1727203910.29790: running the handler 19285 1727203910.29802: _low_level_execute_command(): starting 19285 1727203910.29809: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203910.30496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203910.30522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.30537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.30644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.32344: stdout chunk (state=3): >>>/root <<< 19285 1727203910.32438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203910.32471: stderr chunk (state=3): >>><<< 19285 1727203910.32477: stdout chunk (state=3): >>><<< 19285 1727203910.32496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203910.32508: _low_level_execute_command(): starting 19285 1727203910.32514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874 `" && echo ansible-tmp-1727203910.3249645-20398-11559868312874="` echo /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874 `" ) && sleep 0' 19285 1727203910.33056: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.33063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.33066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.33078: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.33094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.33123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.33190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.35152: stdout chunk (state=3): >>>ansible-tmp-1727203910.3249645-20398-11559868312874=/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874 <<< 19285 1727203910.35311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203910.35315: stdout chunk (state=3): >>><<< 19285 1727203910.35317: stderr chunk (state=3): >>><<< 19285 1727203910.35381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203910.3249645-20398-11559868312874=/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203910.35398: variable 'ansible_module_compression' from source: unknown 19285 1727203910.35465: ANSIBALLZ: Using lock for package_facts 19285 1727203910.35474: ANSIBALLZ: Acquiring lock 19285 1727203910.35484: ANSIBALLZ: Lock acquired: 140487238675696 19285 1727203910.35492: ANSIBALLZ: Creating module 19285 1727203910.70358: ANSIBALLZ: Writing module into payload 19285 1727203910.70587: ANSIBALLZ: Writing module 19285 1727203910.70591: ANSIBALLZ: Renaming module 19285 1727203910.70594: ANSIBALLZ: Done creating module 19285 1727203910.70842: variable 'ansible_facts' from source: unknown 19285 1727203910.71046: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py 19285 1727203910.71284: Sending initial data 19285 1727203910.71287: Sent initial data (161 bytes) 19285 1727203910.71849: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203910.71866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.71893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.71913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.72012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.72040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.72151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.73789: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19285 1727203910.73808: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203910.73909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203910.73991: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpvyfjikek /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py <<< 19285 1727203910.73995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py" <<< 19285 1727203910.74056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpvyfjikek" to remote "/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py" <<< 19285 1727203910.75783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203910.75811: stderr chunk (state=3): >>><<< 19285 1727203910.75825: stdout chunk (state=3): >>><<< 19285 1727203910.75930: done transferring module to remote 19285 1727203910.75933: _low_level_execute_command(): starting 19285 1727203910.75936: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/ /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py && sleep 0' 19285 1727203910.76835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203910.76844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.76855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.76870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.76889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 <<< 19285 1727203910.76936: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203910.76939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.76942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203910.76949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203910.76951: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203910.76954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.76956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.76959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.76964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203910.76966: stderr chunk (state=3): >>>debug2: match found <<< 19285 1727203910.77044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.77047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203910.77062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.77086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.77186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203910.79013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203910.79044: stderr chunk (state=3): >>><<< 19285 1727203910.79054: stdout chunk (state=3): >>><<< 19285 1727203910.79083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203910.79094: _low_level_execute_command(): starting 19285 1727203910.79104: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/AnsiballZ_package_facts.py && sleep 0' 19285 1727203910.79740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203910.79755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203910.79772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203910.79794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203910.79812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 
1727203910.79825: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203910.79843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.79866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203910.79951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203910.79971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203910.79993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203910.80009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203910.80117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203911.24821: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": 
"ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": 
"1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 19285 1727203911.24966: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": 
"2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": 
[{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": 
[{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": 
"3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": 
"3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", 
"version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": 
"git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, 
"arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": 
"1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": 
"4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", 
"version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": 
"2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19285 1727203911.26686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203911.26712: stderr chunk (state=3): >>><<< 19285 1727203911.26717: stdout chunk (state=3): >>><<< 19285 1727203911.26752: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", 
"version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": 
[{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": 
"libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": 
[{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": 
[{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": 
"os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": 
"2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203911.28378: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203911.28398: _low_level_execute_command(): starting 19285 1727203911.28402: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203910.3249645-20398-11559868312874/ > /dev/null 2>&1 && sleep 0' 19285 1727203911.29019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203911.29058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203911.29130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203911.30993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203911.31014: stderr chunk (state=3): >>><<< 19285 1727203911.31017: stdout chunk (state=3): >>><<< 19285 1727203911.31031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203911.31037: handler run complete 19285 1727203911.31493: variable 'ansible_facts' from source: unknown 19285 1727203911.31840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.33609: variable 'ansible_facts' from source: unknown 19285 1727203911.33898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.34291: attempt loop complete, returning result 19285 1727203911.34301: _execute() done 19285 1727203911.34303: dumping result to json 19285 1727203911.34419: done dumping result, returning 19285 1727203911.34427: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-f31b-fb3f-000000000193] 19285 1727203911.34432: sending task result for task 028d2410-947f-f31b-fb3f-000000000193 19285 1727203911.35650: done sending task result for task 028d2410-947f-f31b-fb3f-000000000193 19285 1727203911.35653: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203911.35695: no more pending results, returning what we have 19285 1727203911.35697: results queue empty 19285 1727203911.35697: checking for any_errors_fatal 19285 1727203911.35700: done checking for any_errors_fatal 19285 1727203911.35701: checking for max_fail_percentage 19285 1727203911.35702: done checking for max_fail_percentage 19285 1727203911.35702: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.35703: done checking to see if all hosts have failed 19285 1727203911.35704: getting the remaining hosts for this loop 19285 1727203911.35704: done getting the remaining hosts for this loop 19285 1727203911.35707: 
getting the next task for host managed-node2 19285 1727203911.35713: done getting next task for host managed-node2 19285 1727203911.35716: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203911.35722: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203911.35735: getting variables 19285 1727203911.35737: in VariableManager get_vars() 19285 1727203911.35771: Calling all_inventory to load vars for managed-node2 19285 1727203911.35774: Calling groups_inventory to load vars for managed-node2 19285 1727203911.35778: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.35786: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.35789: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.35791: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.36942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.37805: done with get_vars() 19285 1727203911.37823: done getting variables 19285 1727203911.37870: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:51 -0400 (0:00:01.094) 0:00:10.453 ***** 19285 1727203911.37893: entering _queue_task() for 
managed-node2/debug 19285 1727203911.38130: worker is 1 (out of 1 available) 19285 1727203911.38144: exiting _queue_task() for managed-node2/debug 19285 1727203911.38154: done queuing things up, now waiting for results queue to drain 19285 1727203911.38156: waiting for pending results... 19285 1727203911.38317: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203911.38379: in run() - task 028d2410-947f-f31b-fb3f-000000000015 19285 1727203911.38393: variable 'ansible_search_path' from source: unknown 19285 1727203911.38396: variable 'ansible_search_path' from source: unknown 19285 1727203911.38423: calling self._execute() 19285 1727203911.38492: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.38496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.38507: variable 'omit' from source: magic vars 19285 1727203911.38767: variable 'ansible_distribution_major_version' from source: facts 19285 1727203911.38779: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203911.38785: variable 'omit' from source: magic vars 19285 1727203911.38810: variable 'omit' from source: magic vars 19285 1727203911.38880: variable 'network_provider' from source: set_fact 19285 1727203911.38894: variable 'omit' from source: magic vars 19285 1727203911.38927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203911.38957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203911.38974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203911.38988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203911.38998: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203911.39021: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203911.39025: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.39027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.39100: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203911.39106: Set connection var ansible_pipelining to False 19285 1727203911.39111: Set connection var ansible_timeout to 10 19285 1727203911.39114: Set connection var ansible_shell_type to sh 19285 1727203911.39120: Set connection var ansible_shell_executable to /bin/sh 19285 1727203911.39123: Set connection var ansible_connection to ssh 19285 1727203911.39137: variable 'ansible_shell_executable' from source: unknown 19285 1727203911.39140: variable 'ansible_connection' from source: unknown 19285 1727203911.39143: variable 'ansible_module_compression' from source: unknown 19285 1727203911.39145: variable 'ansible_shell_type' from source: unknown 19285 1727203911.39147: variable 'ansible_shell_executable' from source: unknown 19285 1727203911.39151: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.39153: variable 'ansible_pipelining' from source: unknown 19285 1727203911.39156: variable 'ansible_timeout' from source: unknown 19285 1727203911.39169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.39261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203911.39267: variable 'omit' from source: magic vars 19285 1727203911.39280: starting attempt 
loop 19285 1727203911.39284: running the handler 19285 1727203911.39312: handler run complete 19285 1727203911.39322: attempt loop complete, returning result 19285 1727203911.39325: _execute() done 19285 1727203911.39328: dumping result to json 19285 1727203911.39330: done dumping result, returning 19285 1727203911.39336: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-f31b-fb3f-000000000015] 19285 1727203911.39340: sending task result for task 028d2410-947f-f31b-fb3f-000000000015 19285 1727203911.39418: done sending task result for task 028d2410-947f-f31b-fb3f-000000000015 19285 1727203911.39421: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 19285 1727203911.39478: no more pending results, returning what we have 19285 1727203911.39481: results queue empty 19285 1727203911.39482: checking for any_errors_fatal 19285 1727203911.39491: done checking for any_errors_fatal 19285 1727203911.39492: checking for max_fail_percentage 19285 1727203911.39494: done checking for max_fail_percentage 19285 1727203911.39494: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.39495: done checking to see if all hosts have failed 19285 1727203911.39496: getting the remaining hosts for this loop 19285 1727203911.39497: done getting the remaining hosts for this loop 19285 1727203911.39501: getting the next task for host managed-node2 19285 1727203911.39507: done getting next task for host managed-node2 19285 1727203911.39511: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203911.39512: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 19285 1727203911.39521: getting variables 19285 1727203911.39523: in VariableManager get_vars() 19285 1727203911.39554: Calling all_inventory to load vars for managed-node2 19285 1727203911.39557: Calling groups_inventory to load vars for managed-node2 19285 1727203911.39562: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.39570: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.39573: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.39582: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.40327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.41188: done with get_vars() 19285 1727203911.41204: done getting variables 19285 1727203911.41269: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.033) 0:00:10.487 ***** 19285 1727203911.41292: entering _queue_task() for managed-node2/fail 19285 1727203911.41293: Creating lock for fail 19285 1727203911.41514: worker is 1 (out of 1 available) 19285 1727203911.41527: exiting _queue_task() for managed-node2/fail 19285 1727203911.41539: done queuing things up, now waiting for results queue to drain 19285 1727203911.41540: waiting for pending results... 
19285 1727203911.41690: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203911.41765: in run() - task 028d2410-947f-f31b-fb3f-000000000016 19285 1727203911.41779: variable 'ansible_search_path' from source: unknown 19285 1727203911.41783: variable 'ansible_search_path' from source: unknown 19285 1727203911.41807: calling self._execute() 19285 1727203911.41869: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.41873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.41882: variable 'omit' from source: magic vars 19285 1727203911.42148: variable 'ansible_distribution_major_version' from source: facts 19285 1727203911.42157: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203911.42241: variable 'network_state' from source: role '' defaults 19285 1727203911.42250: Evaluated conditional (network_state != {}): False 19285 1727203911.42254: when evaluation is False, skipping this task 19285 1727203911.42257: _execute() done 19285 1727203911.42260: dumping result to json 19285 1727203911.42262: done dumping result, returning 19285 1727203911.42270: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-f31b-fb3f-000000000016] 19285 1727203911.42277: sending task result for task 028d2410-947f-f31b-fb3f-000000000016 19285 1727203911.42353: done sending task result for task 028d2410-947f-f31b-fb3f-000000000016 19285 1727203911.42356: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203911.42403: no more pending results, 
returning what we have 19285 1727203911.42407: results queue empty 19285 1727203911.42408: checking for any_errors_fatal 19285 1727203911.42415: done checking for any_errors_fatal 19285 1727203911.42416: checking for max_fail_percentage 19285 1727203911.42417: done checking for max_fail_percentage 19285 1727203911.42418: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.42419: done checking to see if all hosts have failed 19285 1727203911.42419: getting the remaining hosts for this loop 19285 1727203911.42420: done getting the remaining hosts for this loop 19285 1727203911.42424: getting the next task for host managed-node2 19285 1727203911.42431: done getting next task for host managed-node2 19285 1727203911.42435: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203911.42437: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203911.42449: getting variables 19285 1727203911.42450: in VariableManager get_vars() 19285 1727203911.42488: Calling all_inventory to load vars for managed-node2 19285 1727203911.42491: Calling groups_inventory to load vars for managed-node2 19285 1727203911.42493: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.42501: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.42503: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.42505: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.43330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.44166: done with get_vars() 19285 1727203911.44186: done getting variables 19285 1727203911.44230: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.029) 0:00:10.516 ***** 19285 1727203911.44250: entering _queue_task() for managed-node2/fail 19285 1727203911.44459: worker is 1 (out of 1 available) 19285 1727203911.44472: exiting _queue_task() for managed-node2/fail 19285 1727203911.44486: done queuing things up, now waiting for results queue to drain 19285 1727203911.44487: waiting for pending results... 
19285 1727203911.44643: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203911.44708: in run() - task 028d2410-947f-f31b-fb3f-000000000017 19285 1727203911.44721: variable 'ansible_search_path' from source: unknown 19285 1727203911.44725: variable 'ansible_search_path' from source: unknown 19285 1727203911.44750: calling self._execute() 19285 1727203911.44816: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.44823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.44834: variable 'omit' from source: magic vars 19285 1727203911.45097: variable 'ansible_distribution_major_version' from source: facts 19285 1727203911.45106: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203911.45188: variable 'network_state' from source: role '' defaults 19285 1727203911.45196: Evaluated conditional (network_state != {}): False 19285 1727203911.45199: when evaluation is False, skipping this task 19285 1727203911.45202: _execute() done 19285 1727203911.45204: dumping result to json 19285 1727203911.45207: done dumping result, returning 19285 1727203911.45214: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-f31b-fb3f-000000000017] 19285 1727203911.45218: sending task result for task 028d2410-947f-f31b-fb3f-000000000017 19285 1727203911.45300: done sending task result for task 028d2410-947f-f31b-fb3f-000000000017 19285 1727203911.45303: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203911.45347: no more pending results, returning what we have 19285 
1727203911.45350: results queue empty 19285 1727203911.45351: checking for any_errors_fatal 19285 1727203911.45359: done checking for any_errors_fatal 19285 1727203911.45360: checking for max_fail_percentage 19285 1727203911.45362: done checking for max_fail_percentage 19285 1727203911.45362: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.45363: done checking to see if all hosts have failed 19285 1727203911.45364: getting the remaining hosts for this loop 19285 1727203911.45365: done getting the remaining hosts for this loop 19285 1727203911.45369: getting the next task for host managed-node2 19285 1727203911.45377: done getting next task for host managed-node2 19285 1727203911.45381: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19285 1727203911.45383: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203911.45397: getting variables 19285 1727203911.45399: in VariableManager get_vars() 19285 1727203911.45431: Calling all_inventory to load vars for managed-node2 19285 1727203911.45433: Calling groups_inventory to load vars for managed-node2 19285 1727203911.45435: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.45443: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.45445: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.45447: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.46189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.47037: done with get_vars() 19285 1727203911.47055: done getting variables 19285 1727203911.47099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.028) 0:00:10.545 ***** 19285 1727203911.47123: entering _queue_task() for managed-node2/fail 19285 1727203911.47342: worker is 1 (out of 1 available) 19285 1727203911.47358: exiting _queue_task() for managed-node2/fail 19285 1727203911.47369: done queuing things up, now waiting for results queue to drain 19285 1727203911.47371: waiting for pending results... 
19285 1727203911.47531: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
19285 1727203911.47601: in run() - task 028d2410-947f-f31b-fb3f-000000000018
19285 1727203911.47610: variable 'ansible_search_path' from source: unknown
19285 1727203911.47614: variable 'ansible_search_path' from source: unknown
19285 1727203911.47641: calling self._execute()
19285 1727203911.47712: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.47715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.47725: variable 'omit' from source: magic vars
19285 1727203911.47990: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.47999: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.48119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203911.49837: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203911.49892: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203911.49921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203911.49945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203911.49967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203911.50027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.50047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.50066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.50095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.50108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.50174: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.50187: Evaluated conditional (ansible_distribution_major_version | int > 9): True
19285 1727203911.50268: variable 'ansible_distribution' from source: facts
19285 1727203911.50271: variable '__network_rh_distros' from source: role '' defaults
19285 1727203911.50281: Evaluated conditional (ansible_distribution in __network_rh_distros): True
19285 1727203911.50443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.50461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.50481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.50506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.50518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.50552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.50571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.50589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.50613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.50623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.50655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.50675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.50693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.50716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.50726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.50908: variable 'network_connections' from source: play vars
19285 1727203911.50917: variable 'interface' from source: set_fact
19285 1727203911.50969: variable 'interface' from source: set_fact
19285 1727203911.50980: variable 'interface' from source: set_fact
19285 1727203911.51020: variable 'interface' from source: set_fact
19285 1727203911.51028: variable 'network_state' from source: role '' defaults
19285 1727203911.51077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203911.51188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203911.51215: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203911.51247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203911.51271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203911.51307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203911.51324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203911.51341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.51358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203911.51390: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
19285 1727203911.51394: when evaluation is False, skipping this task
19285 1727203911.51396: _execute() done
19285 1727203911.51398: dumping result to json
19285 1727203911.51400: done dumping result, returning
19285 1727203911.51409: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-f31b-fb3f-000000000018]
19285 1727203911.51412: sending task result for task 028d2410-947f-f31b-fb3f-000000000018
19285 1727203911.51495: done sending task result for task 028d2410-947f-f31b-fb3f-000000000018
19285 1727203911.51498: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
19285 1727203911.51542: no more pending results, returning what we have
19285 1727203911.51546: results queue empty
19285 1727203911.51547: checking for any_errors_fatal
19285 1727203911.51552: done checking for any_errors_fatal
19285 1727203911.51553: checking for max_fail_percentage
19285 1727203911.51555: done checking for max_fail_percentage
19285 1727203911.51555: checking to see if all hosts have failed and the running result is not ok
19285 1727203911.51556: done checking to see if all hosts have failed
19285 1727203911.51557: getting the remaining hosts for this loop
19285 1727203911.51558: done getting the remaining hosts for this loop
19285 1727203911.51562: getting the next task for host managed-node2
19285 1727203911.51569: done getting next task for host managed-node2
19285 1727203911.51573: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
19285 1727203911.51574: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203911.51588: getting variables
19285 1727203911.51590: in VariableManager get_vars()
19285 1727203911.51625: Calling all_inventory to load vars for managed-node2
19285 1727203911.51628: Calling groups_inventory to load vars for managed-node2
19285 1727203911.51631: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203911.51640: Calling all_plugins_play to load vars for managed-node2
19285 1727203911.51642: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203911.51644: Calling groups_plugins_play to load vars for managed-node2
19285 1727203911.52531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203911.53374: done with get_vars()
19285 1727203911.53392: done getting variables
19285 1727203911.53467: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.063) 0:00:10.609 *****
19285 1727203911.53491: entering _queue_task() for managed-node2/dnf
19285 1727203911.53729: worker is 1 (out of 1 available)
19285 1727203911.53742: exiting _queue_task() for managed-node2/dnf
19285 1727203911.53754: done queuing things up, now waiting for results queue to drain
19285 1727203911.53755: waiting for pending results...
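The teaming task above was skipped because its `when` expression found no team-type entries. As a rough illustration only (the sample data and helper name below are hypothetical, not the role's implementation), this Python sketch shows what the logged Jinja2 pipeline `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` computes; note that Jinja2's `match` test, like Python's `re.match`, anchors at the start of the string:

```python
import re

def has_team_type(entries):
    # Rough equivalent of the logged pipeline:
    #   entries | selectattr("type", "defined")
    #           | selectattr("type", "match", "^team$") | list | length > 0
    return len([e for e in entries
                if "type" in e and re.match("^team$", e["type"])]) > 0

# Hypothetical stand-ins for the play's variables on this host.
network_connections = [{"name": "ethtest0", "type": "ethernet"}]
network_state = {}

result = (has_team_type(network_connections)
          or has_team_type(network_state.get("interfaces", [])))
print(result)  # False, matching the "Evaluated conditional ...: False" entry
```

With no entry whose `type` is exactly `team` in either variable, the conditional is False and the EL10 teaming guard never fires.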
19285 1727203911.53926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
19285 1727203911.53996: in run() - task 028d2410-947f-f31b-fb3f-000000000019
19285 1727203911.54007: variable 'ansible_search_path' from source: unknown
19285 1727203911.54011: variable 'ansible_search_path' from source: unknown
19285 1727203911.54041: calling self._execute()
19285 1727203911.54110: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.54114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.54123: variable 'omit' from source: magic vars
19285 1727203911.54407: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.54415: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.54549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203911.56071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203911.56278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203911.56282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203911.56284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203911.56287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203911.56328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.56391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.56427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.56480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.56511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.56642: variable 'ansible_distribution' from source: facts
19285 1727203911.56653: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.56674: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
19285 1727203911.56806: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203911.56965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.56997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.57029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.57064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.57111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.57122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.57144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.57157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.57187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.57198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.57225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.57240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.57262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.57294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.57304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.57405: variable 'network_connections' from source: play vars
19285 1727203911.57415: variable 'interface' from source: set_fact
19285 1727203911.57464: variable 'interface' from source: set_fact
19285 1727203911.57479: variable 'interface' from source: set_fact
19285 1727203911.57521: variable 'interface' from source: set_fact
19285 1727203911.57566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203911.57696: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203911.57723: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203911.57744: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203911.57766: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203911.57802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203911.57820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203911.57842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.57862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203911.57907: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203911.58054: variable 'network_connections' from source: play vars
19285 1727203911.58057: variable 'interface' from source: set_fact
19285 1727203911.58102: variable 'interface' from source: set_fact
19285 1727203911.58108: variable 'interface' from source: set_fact
19285 1727203911.58152: variable 'interface' from source: set_fact
19285 1727203911.58178: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19285 1727203911.58181: when evaluation is False, skipping this task
19285 1727203911.58184: _execute() done
19285 1727203911.58186: dumping result to json
19285 1727203911.58188: done dumping result, returning
19285 1727203911.58196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000019]
19285 1727203911.58201: sending task result for task 028d2410-947f-f31b-fb3f-000000000019
19285 1727203911.58293: done sending task result for task 028d2410-947f-f31b-fb3f-000000000019
19285 1727203911.58295: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
19285 1727203911.58345: no more pending results, returning what we have
19285 1727203911.58349: results queue empty
19285 1727203911.58350: checking for any_errors_fatal
19285 1727203911.58355: done checking for any_errors_fatal
19285 1727203911.58356: checking for max_fail_percentage
19285 1727203911.58357: done checking for max_fail_percentage
19285 1727203911.58358: checking to see if all hosts have failed and the running result is not ok
19285 1727203911.58359: done checking to see if all hosts have failed
19285 1727203911.58361: getting the remaining hosts for this loop
19285 1727203911.58363: done getting the remaining hosts for this loop
19285 1727203911.58366: getting the next task for host managed-node2
19285 1727203911.58373: done getting next task for host managed-node2
19285 1727203911.58378: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
19285 1727203911.58380: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203911.58392: getting variables
19285 1727203911.58394: in VariableManager get_vars()
19285 1727203911.58430: Calling all_inventory to load vars for managed-node2
19285 1727203911.58433: Calling groups_inventory to load vars for managed-node2
19285 1727203911.58435: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203911.58445: Calling all_plugins_play to load vars for managed-node2
19285 1727203911.58447: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203911.58450: Calling groups_plugins_play to load vars for managed-node2
19285 1727203911.59354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203911.60965: done with get_vars()
19285 1727203911.60989: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19285 1727203911.61070: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.076) 0:00:10.685 *****
19285 1727203911.61102: entering _queue_task() for managed-node2/yum
19285 1727203911.61104: Creating lock for yum
19285 1727203911.61436: worker is 1 (out of 1 available)
19285 1727203911.61449: exiting _queue_task() for managed-node2/yum
19285 1727203911.61464: done queuing things up, now waiting for results queue to drain
19285 1727203911.61465: waiting for pending results...
19285 1727203911.61894: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
19285 1727203911.61898: in run() - task 028d2410-947f-f31b-fb3f-00000000001a
19285 1727203911.61901: variable 'ansible_search_path' from source: unknown
19285 1727203911.61903: variable 'ansible_search_path' from source: unknown
19285 1727203911.61905: calling self._execute()
19285 1727203911.61973: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.61991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.62005: variable 'omit' from source: magic vars
19285 1727203911.62374: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.62393: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.62573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203911.64798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203911.64876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203911.64923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203911.65080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203911.65083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203911.65086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.65116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.65146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.65196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.65218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.65325: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.65420: Evaluated conditional (ansible_distribution_major_version | int < 8): False
19285 1727203911.65423: when evaluation is False, skipping this task
19285 1727203911.65426: _execute() done
19285 1727203911.65428: dumping result to json
19285 1727203911.65430: done dumping result, returning
19285 1727203911.65433: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-00000000001a]
19285 1727203911.65436: sending task result for task 028d2410-947f-f31b-fb3f-00000000001a
19285 1727203911.65513: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001a
19285 1727203911.65516: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
19285 1727203911.65578: no more pending results, returning what we have
19285 1727203911.65583: results queue empty
19285 1727203911.65584: checking for any_errors_fatal
19285 1727203911.65589: done checking for any_errors_fatal
19285 1727203911.65590: checking for max_fail_percentage
19285 1727203911.65592: done checking for max_fail_percentage
19285 1727203911.65593: checking to see if all hosts have failed and the running result is not ok
19285 1727203911.65594: done checking to see if all hosts have failed
19285 1727203911.65594: getting the remaining hosts for this loop
19285 1727203911.65596: done getting the remaining hosts for this loop
19285 1727203911.65600: getting the next task for host managed-node2
19285 1727203911.65608: done getting next task for host managed-node2
19285 1727203911.65612: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19285 1727203911.65614: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203911.65628: getting variables
19285 1727203911.65630: in VariableManager get_vars()
19285 1727203911.65673: Calling all_inventory to load vars for managed-node2
19285 1727203911.65781: Calling groups_inventory to load vars for managed-node2
19285 1727203911.65785: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203911.65797: Calling all_plugins_play to load vars for managed-node2
19285 1727203911.65800: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203911.65803: Calling groups_plugins_play to load vars for managed-node2
19285 1727203911.66979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203911.67832: done with get_vars()
19285 1727203911.67849: done getting variables
19285 1727203911.67897: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.068) 0:00:10.753 *****
19285 1727203911.67919: entering _queue_task() for managed-node2/fail
19285 1727203911.68156: worker is 1 (out of 1 available)
19285 1727203911.68176: exiting _queue_task() for managed-node2/fail
19285 1727203911.68189: done queuing things up, now waiting for results queue to drain
19285 1727203911.68190: waiting for pending results...
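The YUM variant above is skipped on this host because the role gates it on `ansible_distribution_major_version | int < 8`. The fact value `"10"` below is an assumption consistent with the earlier `| int > 9: True` evaluation; this minimal Python sketch shows why the `int` filter matters in that condition, since facts arrive as strings and a bare string comparison would be lexicographic:

```python
# Hypothetical fact value; the earlier "| int > 9: True" entry implies a
# major version of at least 10 on this host.
ansible_distribution_major_version = "10"

# Without the filter, "10" < "8" compares character by character ('1' < '8'),
# which would wrongly select the YUM path on an EL10 host.
lexicographic = ansible_distribution_major_version < "8"

# The role's "| int < 8" forces the numeric comparison the log shows as False.
numeric = int(ansible_distribution_major_version) < 8

print(lexicographic)  # True
print(numeric)        # False, so the YUM task is skipped, as logged
```

The same gate sends EL8+ and Fedora hosts through the DNF task seen earlier instead.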
19285 1727203911.68364: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19285 1727203911.68701: in run() - task 028d2410-947f-f31b-fb3f-00000000001b
19285 1727203911.68705: variable 'ansible_search_path' from source: unknown
19285 1727203911.68708: variable 'ansible_search_path' from source: unknown
19285 1727203911.68710: calling self._execute()
19285 1727203911.68713: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.68715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.68717: variable 'omit' from source: magic vars
19285 1727203911.69089: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.69107: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.69339: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203911.69545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203911.71038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203911.71085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203911.71115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203911.71140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203911.71159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203911.71224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.71243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.71261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.71290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.71301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.71337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.71353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.71373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.71399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.71409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.71437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.71457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.71474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.71500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.71510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.71631: variable 'network_connections' from source: play vars
19285 1727203911.71663: variable 'interface' from source: set_fact
19285 1727203911.71741: variable 'interface' from source: set_fact
19285 1727203911.71745: variable 'interface' from source: set_fact
19285 1727203911.71880: variable 'interface' from source: set_fact
19285 1727203911.71883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203911.72328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203911.72369: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203911.72404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203911.72452: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203911.72508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203911.72544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203911.72586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.72617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203911.72769: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203911.72963: variable 'network_connections' from source: play vars
19285 1727203911.72982: variable 'interface' from source: set_fact
19285 1727203911.73073: variable 'interface' from source: set_fact
19285 1727203911.73105: variable 'interface' from source: set_fact
19285 1727203911.73149: variable 'interface' from source: set_fact
19285 1727203911.73191: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19285 1727203911.73194: when evaluation is False, skipping this task
19285 1727203911.73203: _execute() done
19285 1727203911.73205: dumping result to json
19285 1727203911.73211: done dumping result, returning
19285 1727203911.73218: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-00000000001b] 19285 1727203911.73229: sending task result for task 028d2410-947f-f31b-fb3f-00000000001b 19285 1727203911.73321: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001b 19285 1727203911.73324: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19285 1727203911.73376: no more pending results, returning what we have 19285 1727203911.73380: results queue empty 19285 1727203911.73381: checking for any_errors_fatal 19285 1727203911.73386: done checking for any_errors_fatal 19285 1727203911.73386: checking for max_fail_percentage 19285 1727203911.73388: done checking for max_fail_percentage 19285 1727203911.73389: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.73390: done checking to see if all hosts have failed 19285 1727203911.73390: getting the remaining hosts for this loop 19285 1727203911.73392: done getting the remaining hosts for this loop 19285 1727203911.73395: getting the next task for host managed-node2 19285 1727203911.73402: done getting next task for host managed-node2 19285 1727203911.73405: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19285 1727203911.73407: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203911.73421: getting variables 19285 1727203911.73422: in VariableManager get_vars() 19285 1727203911.73458: Calling all_inventory to load vars for managed-node2 19285 1727203911.73463: Calling groups_inventory to load vars for managed-node2 19285 1727203911.73465: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.73475: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.73479: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.73482: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.74381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.75341: done with get_vars() 19285 1727203911.75362: done getting variables 19285 1727203911.75419: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.075) 0:00:10.828 ***** 19285 1727203911.75453: entering _queue_task() for managed-node2/package 19285 1727203911.75821: worker is 1 (out of 1 available) 19285 1727203911.75835: exiting _queue_task() for managed-node2/package 19285 1727203911.75863: done queuing things up, now waiting for results queue to drain 19285 1727203911.75865: waiting for pending results... 
19285 1727203911.76005: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages
19285 1727203911.76181: in run() - task 028d2410-947f-f31b-fb3f-00000000001c
19285 1727203911.76184: variable 'ansible_search_path' from source: unknown
19285 1727203911.76186: variable 'ansible_search_path' from source: unknown
19285 1727203911.76189: calling self._execute()
19285 1727203911.76238: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.76249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.76265: variable 'omit' from source: magic vars
19285 1727203911.76620: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.76636: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.76821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203911.77090: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203911.77140: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203911.77177: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203911.77244: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203911.77300: variable 'network_packages' from source: role '' defaults
19285 1727203911.77373: variable '__network_provider_setup' from source: role '' defaults
19285 1727203911.77384: variable '__network_service_name_default_nm' from source: role '' defaults
19285 1727203911.77438: variable '__network_service_name_default_nm' from source: role '' defaults
19285 1727203911.77445: variable '__network_packages_default_nm' from source: role '' defaults
19285 1727203911.77493: variable '__network_packages_default_nm' from source: role '' defaults
19285 1727203911.77603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203911.78925: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203911.78974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203911.79005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203911.79028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203911.79047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203911.79109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.79129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.79159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.79369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.79372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.79376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.79379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.79381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.79384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.79386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.79617: variable '__network_packages_default_gobject_packages' from source: role '' defaults
19285 1727203911.79734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.79764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.79796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.79913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.79916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.80001: variable 'ansible_python' from source: facts
19285 1727203911.80045: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
19285 1727203911.80159: variable '__network_wpa_supplicant_required' from source: role '' defaults
19285 1727203911.80230: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
19285 1727203911.80325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.80346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.80365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.80390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.80400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.80431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203911.80454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203911.80470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.80498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203911.80513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203911.80609: variable 'network_connections' from source: play vars
19285 1727203911.80613: variable 'interface' from source: set_fact
19285 1727203911.80684: variable 'interface' from source: set_fact
19285 1727203911.80693: variable 'interface' from source: set_fact
19285 1727203911.80761: variable 'interface' from source: set_fact
19285 1727203911.80813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203911.80832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203911.80852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203911.80874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203911.80913: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203911.81087: variable 'network_connections' from source: play vars
19285 1727203911.81090: variable 'interface' from source: set_fact
19285 1727203911.81163: variable 'interface' from source: set_fact
19285 1727203911.81167: variable 'interface' from source: set_fact
19285 1727203911.81239: variable 'interface' from source: set_fact
19285 1727203911.81277: variable '__network_packages_default_wireless' from source: role '' defaults
19285 1727203911.81330: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203911.81525: variable 'network_connections' from source: play vars
19285 1727203911.81528: variable 'interface' from source: set_fact
19285 1727203911.81577: variable 'interface' from source: set_fact
19285 1727203911.81583: variable 'interface' from source: set_fact
19285 1727203911.81626: variable 'interface' from source: set_fact
19285 1727203911.81643: variable '__network_packages_default_team' from source: role '' defaults
19285 1727203911.81700: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203911.81888: variable 'network_connections' from source: play vars
19285 1727203911.81893: variable 'interface' from source: set_fact
19285 1727203911.81937: variable 'interface' from source: set_fact
19285 1727203911.81943: variable 'interface' from source: set_fact
19285 1727203911.81990: variable 'interface' from source: set_fact
19285 1727203911.82034: variable '__network_service_name_default_initscripts' from source: role '' defaults
19285 1727203911.82077: variable '__network_service_name_default_initscripts' from source: role '' defaults
19285 1727203911.82084: variable '__network_packages_default_initscripts' from source: role '' defaults
19285 1727203911.82126: variable '__network_packages_default_initscripts' from source: role '' defaults
19285 1727203911.82273: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
19285 1727203911.82567: variable 'network_connections' from source: play vars
19285 1727203911.82571: variable 'interface' from source: set_fact
19285 1727203911.82614: variable 'interface' from source: set_fact
19285 1727203911.82620: variable 'interface' from source: set_fact
19285 1727203911.82664: variable 'interface' from source: set_fact
19285 1727203911.82677: variable 'ansible_distribution' from source: facts
19285 1727203911.82680: variable '__network_rh_distros' from source: role '' defaults
19285 1727203911.82686: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.82708: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
19285 1727203911.82814: variable 'ansible_distribution' from source: facts
19285 1727203911.82818: variable '__network_rh_distros' from source: role '' defaults
19285 1727203911.82822: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.82831: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
19285 1727203911.82936: variable 'ansible_distribution' from source: facts
19285 1727203911.82939: variable '__network_rh_distros' from source: role '' defaults
19285 1727203911.82942: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.82971: variable 'network_provider' from source: set_fact
19285 1727203911.82984: variable 'ansible_facts' from source: unknown
19285 1727203911.83402: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
19285 1727203911.83406: when evaluation is False, skipping this task
19285 1727203911.83410: _execute() done
19285 1727203911.83412: dumping result to json
19285 1727203911.83414: done dumping result, returning
19285 1727203911.83421: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-f31b-fb3f-00000000001c]
19285 1727203911.83426: sending task result for task 028d2410-947f-f31b-fb3f-00000000001c
19285 1727203911.83515: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001c
19285 1727203911.83518: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
19285 1727203911.83571: no more pending results, returning what we have
19285 1727203911.83574: results queue empty
19285 1727203911.83577: checking for any_errors_fatal
19285 1727203911.83582: done checking for any_errors_fatal
19285 1727203911.83583: checking for max_fail_percentage
19285 1727203911.83585: done checking for max_fail_percentage
19285 1727203911.83585: checking to see if all hosts have failed and the running result is not ok
19285 1727203911.83586: done checking to see if all hosts have failed
19285 1727203911.83587: getting the remaining hosts for this loop
19285 1727203911.83589: done getting the remaining hosts for this loop
19285 1727203911.83592: getting the next task for host managed-node2
19285 1727203911.83599: done getting next task for host managed-node2
19285 1727203911.83603: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
19285 1727203911.83604: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203911.83616: getting variables
19285 1727203911.83618: in VariableManager get_vars()
19285 1727203911.83655: Calling all_inventory to load vars for managed-node2
19285 1727203911.83657: Calling groups_inventory to load vars for managed-node2
19285 1727203911.83662: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203911.83683: Calling all_plugins_play to load vars for managed-node2
19285 1727203911.83686: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203911.83690: Calling groups_plugins_play to load vars for managed-node2
19285 1727203911.84495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203911.85352: done with get_vars()
19285 1727203911.85371: done getting variables
19285 1727203911.85417: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.099) 0:00:10.928 *****
19285 1727203911.85438: entering _queue_task() for managed-node2/package
19285 1727203911.85673: worker is 1 (out of 1 available)
19285 1727203911.85689: exiting _queue_task() for managed-node2/package
19285 1727203911.85702: done queuing things up, now waiting for results queue to drain
19285 1727203911.85703: waiting for pending results...
19285 1727203911.85864: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
19285 1727203911.85935: in run() - task 028d2410-947f-f31b-fb3f-00000000001d
19285 1727203911.85941: variable 'ansible_search_path' from source: unknown
19285 1727203911.85944: variable 'ansible_search_path' from source: unknown
19285 1727203911.85978: calling self._execute()
19285 1727203911.86047: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203911.86051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203911.86061: variable 'omit' from source: magic vars
19285 1727203911.86333: variable 'ansible_distribution_major_version' from source: facts
19285 1727203911.86342: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203911.86427: variable 'network_state' from source: role '' defaults
19285 1727203911.86436: Evaluated conditional (network_state != {}): False
19285 1727203911.86438: when evaluation is False, skipping this task
19285 1727203911.86441: _execute() done
19285 1727203911.86443: dumping result to json
19285 1727203911.86446: done dumping result, returning
19285 1727203911.86453: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-f31b-fb3f-00000000001d]
19285 1727203911.86458: sending task result for task 028d2410-947f-f31b-fb3f-00000000001d
19285 1727203911.86545: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001d
19285 1727203911.86548: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
19285 1727203911.86621: no more pending results, returning what we have
19285 1727203911.86625: results queue empty
19285 1727203911.86626: checking for any_errors_fatal
19285 1727203911.86633: done checking for any_errors_fatal
19285 1727203911.86633: checking for max_fail_percentage
19285 1727203911.86635: done checking for max_fail_percentage
19285 1727203911.86635: checking to see if all hosts have failed and the running result is not ok
19285 1727203911.86636: done checking to see if all hosts have failed
19285 1727203911.86637: getting the remaining hosts for this loop
19285 1727203911.86639: done getting the remaining hosts for this loop
19285 1727203911.86642: getting the next task for host managed-node2
19285 1727203911.86648: done getting next task for host managed-node2
19285 1727203911.86651: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
19285 1727203911.86654: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203911.86668: getting variables
19285 1727203911.86669: in VariableManager get_vars()
19285 1727203911.86701: Calling all_inventory to load vars for managed-node2
19285 1727203911.86703: Calling groups_inventory to load vars for managed-node2
19285 1727203911.86705: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203911.86714: Calling all_plugins_play to load vars for managed-node2
19285 1727203911.86716: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203911.86718: Calling groups_plugins_play to load vars for managed-node2
19285 1727203911.90073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203911.90907: done with get_vars()
19285 1727203911.90923: done getting variables
19285 1727203911.90956: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.055) 0:00:10.984 *****
19285 1727203911.90975: entering _queue_task() for managed-node2/package
19285 1727203911.91221: worker is 1 (out of 1 available)
19285 1727203911.91235: exiting _queue_task() for managed-node2/package
19285 1727203911.91247: done queuing things up, now waiting for results queue to drain
19285 1727203911.91250: waiting for pending results...
19285 1727203911.91418: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19285 1727203911.91491: in run() - task 028d2410-947f-f31b-fb3f-00000000001e 19285 1727203911.91501: variable 'ansible_search_path' from source: unknown 19285 1727203911.91506: variable 'ansible_search_path' from source: unknown 19285 1727203911.91535: calling self._execute() 19285 1727203911.91610: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.91614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.91621: variable 'omit' from source: magic vars 19285 1727203911.91892: variable 'ansible_distribution_major_version' from source: facts 19285 1727203911.91902: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203911.91985: variable 'network_state' from source: role '' defaults 19285 1727203911.91993: Evaluated conditional (network_state != {}): False 19285 1727203911.91996: when evaluation is False, skipping this task 19285 1727203911.91998: _execute() done 19285 1727203911.92001: dumping result to json 19285 1727203911.92003: done dumping result, returning 19285 1727203911.92012: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-f31b-fb3f-00000000001e] 19285 1727203911.92015: sending task result for task 028d2410-947f-f31b-fb3f-00000000001e 19285 1727203911.92104: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001e 19285 1727203911.92107: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203911.92168: no more pending results, returning what we have 19285 1727203911.92172: results queue empty 19285 1727203911.92173: checking for 
any_errors_fatal 19285 1727203911.92182: done checking for any_errors_fatal 19285 1727203911.92183: checking for max_fail_percentage 19285 1727203911.92184: done checking for max_fail_percentage 19285 1727203911.92185: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.92186: done checking to see if all hosts have failed 19285 1727203911.92187: getting the remaining hosts for this loop 19285 1727203911.92188: done getting the remaining hosts for this loop 19285 1727203911.92192: getting the next task for host managed-node2 19285 1727203911.92198: done getting next task for host managed-node2 19285 1727203911.92202: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203911.92203: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203911.92218: getting variables 19285 1727203911.92219: in VariableManager get_vars() 19285 1727203911.92250: Calling all_inventory to load vars for managed-node2 19285 1727203911.92253: Calling groups_inventory to load vars for managed-node2 19285 1727203911.92255: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.92263: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.92265: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.92267: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.93004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203911.93866: done with get_vars() 19285 1727203911.93883: done getting variables 19285 1727203911.93951: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.029) 0:00:11.014 ***** 19285 1727203911.93973: entering _queue_task() for managed-node2/service 19285 1727203911.93974: Creating lock for service 19285 1727203911.94197: worker is 1 (out of 1 available) 19285 1727203911.94211: exiting _queue_task() for managed-node2/service 19285 1727203911.94223: done queuing things up, now waiting for results queue to drain 19285 1727203911.94224: waiting for pending results... 
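The skip recorded above — `skipping: [managed-node2]` with `"false_condition": "network_state != {}"` — is ordinary Jinja2 conditional evaluation: the task's `when:` guard renders False because `network_state` still holds its empty role default. A minimal sketch of that evaluation outside Ansible, using plain Jinja2 rather than Ansible's own templar (which adds variable precedence, lazy templating, and more):

```python
from jinja2 import Environment

env = Environment()

# The task's `when:` clause, compiled as a bare Jinja2 expression, mirroring
# the log line "Evaluated conditional (network_state != {}): False".
skip_check = env.compile_expression("network_state != {}")

print(skip_check(network_state={}))                  # False -> task is skipped
print(skip_check(network_state={"interfaces": []}))  # True  -> task would run
```

With the role default of `network_state: {}`, the expression is False, so the TaskExecutor dumps a skip result instead of dispatching the `package` action.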
19285 1727203911.94387: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203911.94458: in run() - task 028d2410-947f-f31b-fb3f-00000000001f 19285 1727203911.94467: variable 'ansible_search_path' from source: unknown 19285 1727203911.94470: variable 'ansible_search_path' from source: unknown 19285 1727203911.94499: calling self._execute() 19285 1727203911.94569: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203911.94574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203911.94585: variable 'omit' from source: magic vars 19285 1727203911.94845: variable 'ansible_distribution_major_version' from source: facts 19285 1727203911.94854: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203911.94939: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203911.95071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203911.96729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203911.96785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203911.96812: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203911.96836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203911.96858: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203911.96919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 19285 1727203911.96939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203911.96957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203911.96990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203911.97001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203911.97036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203911.97053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203911.97077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203911.97103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203911.97114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203911.97142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203911.97157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203911.97177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203911.97206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203911.97216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203911.97332: variable 'network_connections' from source: play vars 19285 1727203911.97341: variable 'interface' from source: set_fact 19285 1727203911.97398: variable 'interface' from source: set_fact 19285 1727203911.97406: variable 'interface' from source: set_fact 19285 1727203911.97447: variable 'interface' from source: set_fact 19285 1727203911.97501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203911.97604: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203911.97631: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203911.97664: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203911.97686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203911.97717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203911.97734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203911.97752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203911.97770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203911.97816: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203911.97966: variable 'network_connections' from source: play vars 19285 1727203911.97970: variable 'interface' from source: set_fact 19285 1727203911.98013: variable 'interface' from source: set_fact 19285 1727203911.98018: variable 'interface' from source: set_fact 19285 1727203911.98063: variable 'interface' from source: set_fact 19285 1727203911.98087: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19285 1727203911.98090: when evaluation is False, skipping this task 19285 1727203911.98093: _execute() done 19285 1727203911.98095: dumping result to json 19285 1727203911.98097: done dumping result, returning 19285 1727203911.98104: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-00000000001f] 19285 1727203911.98115: sending task result for task 028d2410-947f-f31b-fb3f-00000000001f 19285 1727203911.98195: done sending task result for task 028d2410-947f-f31b-fb3f-00000000001f 19285 1727203911.98197: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19285 1727203911.98240: no more pending results, returning what we have 19285 1727203911.98244: results queue empty 19285 1727203911.98245: checking for any_errors_fatal 19285 1727203911.98252: done checking for any_errors_fatal 19285 1727203911.98253: checking for max_fail_percentage 19285 1727203911.98255: done checking for max_fail_percentage 19285 1727203911.98256: checking to see if all hosts have failed and the running result is not ok 19285 1727203911.98257: done checking to see if all hosts have failed 19285 1727203911.98257: getting the remaining hosts for this loop 19285 1727203911.98261: done getting the remaining hosts for this loop 19285 1727203911.98265: getting the next task for host managed-node2 19285 1727203911.98271: done getting next task for host managed-node2 19285 1727203911.98276: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203911.98278: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203911.98290: getting variables 19285 1727203911.98291: in VariableManager get_vars() 19285 1727203911.98328: Calling all_inventory to load vars for managed-node2 19285 1727203911.98331: Calling groups_inventory to load vars for managed-node2 19285 1727203911.98334: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203911.98342: Calling all_plugins_play to load vars for managed-node2 19285 1727203911.98345: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203911.98347: Calling groups_plugins_play to load vars for managed-node2 19285 1727203911.99224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203912.00084: done with get_vars() 19285 1727203912.00097: done getting variables 19285 1727203912.00139: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.061) 0:00:11.076 ***** 19285 1727203912.00159: entering _queue_task() for managed-node2/service 19285 1727203912.00379: worker is 1 (out of 1 available) 19285 1727203912.00393: exiting _queue_task() for managed-node2/service 19285 1727203912.00404: done queuing things up, now waiting for results queue to drain 19285 1727203912.00406: waiting for pending results... 
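Both decisions in this stretch hinge on multi-term `when:` expressions: the restart task is skipped because `__network_wireless_connections_defined or __network_team_connections_defined` evaluates False (no wireless or team interfaces appear in `network_connections`), while the enable-and-start task that follows passes because `network_provider == "nm"` satisfies the first term of its guard. A hedged sketch of those two evaluations with plain Jinja2 (Ansible's real evaluation layers in fact and role-default lookups):

```python
from jinja2 import Environment

env = Environment()

# Restart-NetworkManager guard: both terms are False here, so the task skips.
restart = env.compile_expression(
    "__network_wireless_connections_defined or __network_team_connections_defined"
)
print(restart(__network_wireless_connections_defined=False,
              __network_team_connections_defined=False))  # False -> skip

# Enable-and-start guard: the "nm" provider alone makes this True,
# even though network_state is still the empty role default.
start_nm = env.compile_expression('network_provider == "nm" or network_state != {}')
print(start_nm(network_provider="nm", network_state={}))  # True -> task runs
```

This matches the log: the first evaluation produces a skip result with `"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined"`, and the second proceeds to connection setup and module execution over SSH.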
19285 1727203912.00563: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203912.00623: in run() - task 028d2410-947f-f31b-fb3f-000000000020 19285 1727203912.00637: variable 'ansible_search_path' from source: unknown 19285 1727203912.00643: variable 'ansible_search_path' from source: unknown 19285 1727203912.00677: calling self._execute() 19285 1727203912.00752: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203912.00756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203912.00768: variable 'omit' from source: magic vars 19285 1727203912.01056: variable 'ansible_distribution_major_version' from source: facts 19285 1727203912.01073: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203912.01173: variable 'network_provider' from source: set_fact 19285 1727203912.01178: variable 'network_state' from source: role '' defaults 19285 1727203912.01188: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19285 1727203912.01191: variable 'omit' from source: magic vars 19285 1727203912.01217: variable 'omit' from source: magic vars 19285 1727203912.01236: variable 'network_service_name' from source: role '' defaults 19285 1727203912.01290: variable 'network_service_name' from source: role '' defaults 19285 1727203912.01362: variable '__network_provider_setup' from source: role '' defaults 19285 1727203912.01367: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203912.01412: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203912.01419: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203912.01464: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203912.01606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 19285 1727203912.03181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203912.03189: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203912.03231: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203912.03270: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203912.03303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203912.03384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203912.03418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203912.03448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.03493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203912.03512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203912.03556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19285 1727203912.03587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203912.03614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.03656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203912.03780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203912.03893: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19285 1727203912.03989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203912.04005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203912.04023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.04048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203912.04062: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203912.04119: variable 'ansible_python' from source: facts 19285 1727203912.04135: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19285 1727203912.04193: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203912.04243: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203912.04326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203912.04343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203912.04362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.04390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203912.04400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203912.04431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203912.04456: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203912.04476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.04505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203912.04515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203912.04604: variable 'network_connections' from source: play vars 19285 1727203912.04607: variable 'interface' from source: set_fact 19285 1727203912.04661: variable 'interface' from source: set_fact 19285 1727203912.04668: variable 'interface' from source: set_fact 19285 1727203912.04722: variable 'interface' from source: set_fact 19285 1727203912.04791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203912.04916: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203912.04953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203912.04990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203912.05019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203912.05081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203912.05084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203912.05107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203912.05128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203912.05165: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203912.05334: variable 'network_connections' from source: play vars 19285 1727203912.05339: variable 'interface' from source: set_fact 19285 1727203912.05395: variable 'interface' from source: set_fact 19285 1727203912.05404: variable 'interface' from source: set_fact 19285 1727203912.05452: variable 'interface' from source: set_fact 19285 1727203912.05491: variable '__network_packages_default_wireless' from source: role '' defaults 19285 1727203912.05542: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203912.05723: variable 'network_connections' from source: play vars 19285 1727203912.05727: variable 'interface' from source: set_fact 19285 1727203912.05777: variable 'interface' from source: set_fact 19285 1727203912.05783: variable 'interface' from source: set_fact 19285 1727203912.05833: variable 'interface' from source: set_fact 19285 1727203912.05851: variable '__network_packages_default_team' from source: role '' defaults 19285 1727203912.05912: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203912.06083: variable 
'network_connections' from source: play vars 19285 1727203912.06086: variable 'interface' from source: set_fact 19285 1727203912.06136: variable 'interface' from source: set_fact 19285 1727203912.06141: variable 'interface' from source: set_fact 19285 1727203912.06191: variable 'interface' from source: set_fact 19285 1727203912.06236: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203912.06277: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203912.06282: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203912.06323: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203912.06459: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19285 1727203912.07181: variable 'network_connections' from source: play vars 19285 1727203912.07184: variable 'interface' from source: set_fact 19285 1727203912.07186: variable 'interface' from source: set_fact 19285 1727203912.07188: variable 'interface' from source: set_fact 19285 1727203912.07228: variable 'interface' from source: set_fact 19285 1727203912.07246: variable 'ansible_distribution' from source: facts 19285 1727203912.07258: variable '__network_rh_distros' from source: role '' defaults 19285 1727203912.07270: variable 'ansible_distribution_major_version' from source: facts 19285 1727203912.07305: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19285 1727203912.07468: variable 'ansible_distribution' from source: facts 19285 1727203912.07478: variable '__network_rh_distros' from source: role '' defaults 19285 1727203912.07493: variable 'ansible_distribution_major_version' from source: facts 19285 1727203912.07524: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19285 1727203912.07679: variable 'ansible_distribution' from source: 
facts 19285 1727203912.07691: variable '__network_rh_distros' from source: role '' defaults 19285 1727203912.07706: variable 'ansible_distribution_major_version' from source: facts 19285 1727203912.07750: variable 'network_provider' from source: set_fact 19285 1727203912.07789: variable 'omit' from source: magic vars 19285 1727203912.07835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203912.07880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203912.07912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203912.07939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203912.07980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203912.07998: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203912.08004: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203912.08010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203912.08120: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203912.08181: Set connection var ansible_pipelining to False 19285 1727203912.08187: Set connection var ansible_timeout to 10 19285 1727203912.08190: Set connection var ansible_shell_type to sh 19285 1727203912.08192: Set connection var ansible_shell_executable to /bin/sh 19285 1727203912.08194: Set connection var ansible_connection to ssh 19285 1727203912.08220: variable 'ansible_shell_executable' from source: unknown 19285 1727203912.08230: variable 'ansible_connection' from source: unknown 19285 1727203912.08235: variable 'ansible_module_compression' from source: unknown 19285 1727203912.08241: 
variable 'ansible_shell_type' from source: unknown 19285 1727203912.08244: variable 'ansible_shell_executable' from source: unknown 19285 1727203912.08246: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203912.08253: variable 'ansible_pipelining' from source: unknown 19285 1727203912.08255: variable 'ansible_timeout' from source: unknown 19285 1727203912.08257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203912.08381: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203912.08385: variable 'omit' from source: magic vars 19285 1727203912.08387: starting attempt loop 19285 1727203912.08389: running the handler 19285 1727203912.08580: variable 'ansible_facts' from source: unknown 19285 1727203912.09161: _low_level_execute_command(): starting 19285 1727203912.09173: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203912.09838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203912.09853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203912.09887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203912.09900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203912.09910: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 19285 1727203912.09994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.10015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.10122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.11815: stdout chunk (state=3): >>>/root <<< 19285 1727203912.12136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203912.12147: stdout chunk (state=3): >>><<< 19285 1727203912.12159: stderr chunk (state=3): >>><<< 19285 1727203912.12184: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203912.12200: _low_level_execute_command(): starting 19285 1727203912.12209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706 `" && echo ansible-tmp-1727203912.1219008-20457-81494819567706="` echo /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706 `" ) && sleep 0' 19285 1727203912.12799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203912.12812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203912.12826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203912.12842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203912.12941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 19285 1727203912.12954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.12978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.13071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.14979: stdout chunk (state=3): >>>ansible-tmp-1727203912.1219008-20457-81494819567706=/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706 <<< 19285 1727203912.15139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203912.15145: stdout chunk (state=3): >>><<< 19285 1727203912.15147: stderr chunk (state=3): >>><<< 19285 1727203912.15166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203912.1219008-20457-81494819567706=/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203912.15221: variable 'ansible_module_compression' from source: unknown 19285 1727203912.15288: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 19285 1727203912.15299: ANSIBALLZ: Acquiring lock 19285 1727203912.15306: ANSIBALLZ: Lock acquired: 140487240913488 19285 1727203912.15381: ANSIBALLZ: Creating module 19285 1727203912.51568: ANSIBALLZ: Writing module into payload 19285 1727203912.51767: ANSIBALLZ: Writing module 19285 1727203912.51804: ANSIBALLZ: Renaming module 19285 1727203912.51826: ANSIBALLZ: Done creating module 19285 1727203912.51854: variable 'ansible_facts' from source: unknown 19285 1727203912.52078: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py 19285 1727203912.52285: Sending initial data 19285 1727203912.52288: Sent initial data (155 bytes) 19285 1727203912.52919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203912.52934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203912.52993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203912.53061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203912.53082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.53107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.53234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.54928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203912.55018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203912.55100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp2oy4d0u3 /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py <<< 19285 1727203912.55105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py" <<< 19285 1727203912.55200: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp2oy4d0u3" to remote "/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py" <<< 19285 1727203912.57465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203912.57592: stderr chunk (state=3): >>><<< 19285 1727203912.57595: stdout chunk (state=3): >>><<< 19285 1727203912.57598: done transferring module to remote 19285 1727203912.57600: _low_level_execute_command(): starting 19285 1727203912.57603: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/ /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py && sleep 0' 19285 1727203912.58191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203912.58248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203912.58263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.58279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.58376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.60381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203912.60384: stdout chunk (state=3): >>><<< 19285 1727203912.60387: stderr chunk (state=3): >>><<< 19285 1727203912.60389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203912.60391: _low_level_execute_command(): starting 19285 1727203912.60394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/AnsiballZ_systemd.py && sleep 0' 19285 1727203912.60991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203912.61037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203912.61059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.61064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.61166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.90350: 
stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4435968", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300823040", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "589186000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 19285 1727203912.90388: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", 
"PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", 
"Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysin<<< 19285 1727203912.90397: stdout chunk (state=3): >>>it.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", 
"AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19285 1727203912.92316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203912.92357: stderr chunk (state=3): >>><<< 19285 1727203912.92363: stdout chunk (state=3): >>><<< 19285 1727203912.92377: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 
EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4435968", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300823040", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "589186000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", 
"InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203912.92548: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203912.92584: _low_level_execute_command(): starting 19285 1727203912.92588: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203912.1219008-20457-81494819567706/ > /dev/null 2>&1 && sleep 0' 19285 1727203912.93085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 19285 1727203912.93094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203912.93101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203912.93104: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203912.93137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203912.93204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203912.93217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203912.93299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203912.95204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203912.95224: stderr chunk (state=3): >>><<< 19285 1727203912.95227: stdout chunk (state=3): >>><<< 19285 1727203912.95240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203912.95263: handler run complete 19285 1727203912.95308: attempt loop complete, returning result 19285 1727203912.95311: _execute() done 19285 1727203912.95314: dumping result to json 19285 1727203912.95325: done dumping result, returning 19285 1727203912.95334: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-f31b-fb3f-000000000020] 19285 1727203912.95351: sending task result for task 028d2410-947f-f31b-fb3f-000000000020 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203912.95653: no more pending results, returning what we have 19285 1727203912.95656: results queue empty 19285 1727203912.95657: checking for any_errors_fatal 19285 1727203912.95664: done checking for any_errors_fatal 19285 1727203912.95665: checking for max_fail_percentage 19285 1727203912.95666: done checking for max_fail_percentage 19285 1727203912.95667: checking to see if all hosts have failed and 
the running result is not ok 19285 1727203912.95668: done checking to see if all hosts have failed 19285 1727203912.95668: getting the remaining hosts for this loop 19285 1727203912.95670: done getting the remaining hosts for this loop 19285 1727203912.95673: getting the next task for host managed-node2 19285 1727203912.95681: done getting next task for host managed-node2 19285 1727203912.95686: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19285 1727203912.95687: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203912.95696: getting variables 19285 1727203912.95698: in VariableManager get_vars() 19285 1727203912.95729: Calling all_inventory to load vars for managed-node2 19285 1727203912.95732: Calling groups_inventory to load vars for managed-node2 19285 1727203912.95734: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203912.95746: Calling all_plugins_play to load vars for managed-node2 19285 1727203912.95748: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203912.95751: Calling groups_plugins_play to load vars for managed-node2 19285 1727203912.96288: done sending task result for task 028d2410-947f-f31b-fb3f-000000000020 19285 1727203912.96297: WORKER PROCESS EXITING 19285 1727203912.96921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203912.98171: done with get_vars() 19285 1727203912.98189: done getting variables 19285 1727203912.98235: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:52 -0400 (0:00:00.980) 0:00:12.057 ***** 19285 1727203912.98257: entering _queue_task() for managed-node2/service 19285 1727203912.98563: worker is 1 (out of 1 available) 19285 1727203912.98579: exiting _queue_task() for managed-node2/service 19285 1727203912.98592: done queuing things up, now waiting for results queue to drain 19285 1727203912.98593: waiting for pending results... 19285 1727203912.98798: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19285 1727203912.98896: in run() - task 028d2410-947f-f31b-fb3f-000000000021 19285 1727203912.98901: variable 'ansible_search_path' from source: unknown 19285 1727203912.98903: variable 'ansible_search_path' from source: unknown 19285 1727203912.98935: calling self._execute() 19285 1727203912.99017: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203912.99027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203912.99030: variable 'omit' from source: magic vars 19285 1727203912.99315: variable 'ansible_distribution_major_version' from source: facts 19285 1727203912.99324: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203912.99406: variable 'network_provider' from source: set_fact 19285 1727203912.99410: Evaluated conditional (network_provider == "nm"): True 19285 1727203912.99474: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203912.99538: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 19285 1727203912.99691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203913.01074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203913.01121: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203913.01148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203913.01174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203913.01195: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203913.01265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203913.01286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203913.01304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203913.01333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203913.01345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203913.01378: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203913.01395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203913.01411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203913.01440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203913.01448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203913.01480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203913.01497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203913.01513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203913.01536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 
1727203913.01547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203913.01643: variable 'network_connections' from source: play vars 19285 1727203913.01654: variable 'interface' from source: set_fact 19285 1727203913.01710: variable 'interface' from source: set_fact 19285 1727203913.01717: variable 'interface' from source: set_fact 19285 1727203913.01760: variable 'interface' from source: set_fact 19285 1727203913.01811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203913.01924: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203913.01949: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203913.01972: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203913.01998: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203913.02029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203913.02044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203913.02061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203913.02082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203913.02120: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203913.02282: variable 'network_connections' from source: play vars 19285 1727203913.02285: variable 'interface' from source: set_fact 19285 1727203913.02330: variable 'interface' from source: set_fact 19285 1727203913.02336: variable 'interface' from source: set_fact 19285 1727203913.02380: variable 'interface' from source: set_fact 19285 1727203913.02409: Evaluated conditional (__network_wpa_supplicant_required): False 19285 1727203913.02414: when evaluation is False, skipping this task 19285 1727203913.02416: _execute() done 19285 1727203913.02428: dumping result to json 19285 1727203913.02431: done dumping result, returning 19285 1727203913.02433: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-f31b-fb3f-000000000021] 19285 1727203913.02435: sending task result for task 028d2410-947f-f31b-fb3f-000000000021 19285 1727203913.02511: done sending task result for task 028d2410-947f-f31b-fb3f-000000000021 19285 1727203913.02514: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19285 1727203913.02593: no more pending results, returning what we have 19285 1727203913.02597: results queue empty 19285 1727203913.02598: checking for any_errors_fatal 19285 1727203913.02623: done checking for any_errors_fatal 19285 1727203913.02624: checking for max_fail_percentage 19285 1727203913.02625: done checking for max_fail_percentage 19285 1727203913.02626: checking to see if all hosts have failed and the running result is not ok 19285 1727203913.02627: done checking to see if all hosts have failed 19285 1727203913.02628: getting the remaining hosts for 
this loop 19285 1727203913.02629: done getting the remaining hosts for this loop 19285 1727203913.02633: getting the next task for host managed-node2 19285 1727203913.02639: done getting next task for host managed-node2 19285 1727203913.02642: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203913.02644: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203913.02658: getting variables 19285 1727203913.02660: in VariableManager get_vars() 19285 1727203913.02692: Calling all_inventory to load vars for managed-node2 19285 1727203913.02695: Calling groups_inventory to load vars for managed-node2 19285 1727203913.02697: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203913.02705: Calling all_plugins_play to load vars for managed-node2 19285 1727203913.02707: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203913.02709: Calling groups_plugins_play to load vars for managed-node2 19285 1727203913.03448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203913.04292: done with get_vars() 19285 1727203913.04307: done getting variables 19285 1727203913.04347: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 
14:51:53 -0400 (0:00:00.061) 0:00:12.118 ***** 19285 1727203913.04368: entering _queue_task() for managed-node2/service 19285 1727203913.04582: worker is 1 (out of 1 available) 19285 1727203913.04594: exiting _queue_task() for managed-node2/service 19285 1727203913.04606: done queuing things up, now waiting for results queue to drain 19285 1727203913.04607: waiting for pending results... 19285 1727203913.04770: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203913.04840: in run() - task 028d2410-947f-f31b-fb3f-000000000022 19285 1727203913.04849: variable 'ansible_search_path' from source: unknown 19285 1727203913.04852: variable 'ansible_search_path' from source: unknown 19285 1727203913.04883: calling self._execute() 19285 1727203913.04952: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203913.04956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203913.04967: variable 'omit' from source: magic vars 19285 1727203913.05225: variable 'ansible_distribution_major_version' from source: facts 19285 1727203913.05235: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203913.05315: variable 'network_provider' from source: set_fact 19285 1727203913.05318: Evaluated conditional (network_provider == "initscripts"): False 19285 1727203913.05321: when evaluation is False, skipping this task 19285 1727203913.05324: _execute() done 19285 1727203913.05326: dumping result to json 19285 1727203913.05329: done dumping result, returning 19285 1727203913.05336: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-f31b-fb3f-000000000022] 19285 1727203913.05341: sending task result for task 028d2410-947f-f31b-fb3f-000000000022 19285 1727203913.05423: done sending task result for task 028d2410-947f-f31b-fb3f-000000000022 19285 1727203913.05425: 
WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203913.05465: no more pending results, returning what we have 19285 1727203913.05468: results queue empty 19285 1727203913.05469: checking for any_errors_fatal 19285 1727203913.05481: done checking for any_errors_fatal 19285 1727203913.05482: checking for max_fail_percentage 19285 1727203913.05484: done checking for max_fail_percentage 19285 1727203913.05485: checking to see if all hosts have failed and the running result is not ok 19285 1727203913.05485: done checking to see if all hosts have failed 19285 1727203913.05486: getting the remaining hosts for this loop 19285 1727203913.05488: done getting the remaining hosts for this loop 19285 1727203913.05491: getting the next task for host managed-node2 19285 1727203913.05498: done getting next task for host managed-node2 19285 1727203913.05501: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19285 1727203913.05503: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203913.05516: getting variables 19285 1727203913.05518: in VariableManager get_vars() 19285 1727203913.05546: Calling all_inventory to load vars for managed-node2 19285 1727203913.05548: Calling groups_inventory to load vars for managed-node2 19285 1727203913.05550: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203913.05557: Calling all_plugins_play to load vars for managed-node2 19285 1727203913.05560: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203913.05562: Calling groups_plugins_play to load vars for managed-node2 19285 1727203913.06408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203913.07244: done with get_vars() 19285 1727203913.07258: done getting variables 19285 1727203913.07299: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:53 -0400 (0:00:00.029) 0:00:12.147 ***** 19285 1727203913.07320: entering _queue_task() for managed-node2/copy 19285 1727203913.07524: worker is 1 (out of 1 available) 19285 1727203913.07538: exiting _queue_task() for managed-node2/copy 19285 1727203913.07550: done queuing things up, now waiting for results queue to drain 19285 1727203913.07551: waiting for pending results... 
19285 1727203913.07713: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
19285 1727203913.07782: in run() - task 028d2410-947f-f31b-fb3f-000000000023
19285 1727203913.07797: variable 'ansible_search_path' from source: unknown
19285 1727203913.07801: variable 'ansible_search_path' from source: unknown
19285 1727203913.07853: calling self._execute()
19285 1727203913.07988: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203913.07992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203913.07995: variable 'omit' from source: magic vars
19285 1727203913.08361: variable 'ansible_distribution_major_version' from source: facts
19285 1727203913.08380: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203913.08503: variable 'network_provider' from source: set_fact
19285 1727203913.08514: Evaluated conditional (network_provider == "initscripts"): False
19285 1727203913.08524: when evaluation is False, skipping this task
19285 1727203913.08539: _execute() done
19285 1727203913.08545: dumping result to json
19285 1727203913.08551: done dumping result, returning
19285 1727203913.08581: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-f31b-fb3f-000000000023]
19285 1727203913.08583: sending task result for task 028d2410-947f-f31b-fb3f-000000000023
19285 1727203913.08824: done sending task result for task 028d2410-947f-f31b-fb3f-000000000023
19285 1727203913.08827: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
19285 1727203913.08874: no more pending results, returning what we have
19285 1727203913.08880: results queue empty
19285 1727203913.08881: checking for any_errors_fatal
19285 1727203913.08890: done checking for any_errors_fatal
19285 1727203913.08890: checking for max_fail_percentage
19285 1727203913.08892: done checking for max_fail_percentage
19285 1727203913.08893: checking to see if all hosts have failed and the running result is not ok
19285 1727203913.08894: done checking to see if all hosts have failed
19285 1727203913.08895: getting the remaining hosts for this loop
19285 1727203913.08896: done getting the remaining hosts for this loop
19285 1727203913.08900: getting the next task for host managed-node2
19285 1727203913.08907: done getting next task for host managed-node2
19285 1727203913.08911: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
19285 1727203913.08913: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203913.08927: getting variables
19285 1727203913.08928: in VariableManager get_vars()
19285 1727203913.08969: Calling all_inventory to load vars for managed-node2
19285 1727203913.08971: Calling groups_inventory to load vars for managed-node2
19285 1727203913.08974: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203913.08987: Calling all_plugins_play to load vars for managed-node2
19285 1727203913.08992: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203913.08995: Calling groups_plugins_play to load vars for managed-node2
19285 1727203913.09759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203913.10698: done with get_vars()
19285 1727203913.10712: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024  14:51:53 -0400 (0:00:00.034)       0:00:12.182 *****
19285 1727203913.10767: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
19285 1727203913.10768: Creating lock for fedora.linux_system_roles.network_connections
19285 1727203913.10978: worker is 1 (out of 1 available)
19285 1727203913.10993: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
19285 1727203913.11005: done queuing things up, now waiting for results queue to drain
19285 1727203913.11006: waiting for pending results...
19285 1727203913.11307: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
19285 1727203913.11333: in run() - task 028d2410-947f-f31b-fb3f-000000000024
19285 1727203913.11354: variable 'ansible_search_path' from source: unknown
19285 1727203913.11361: variable 'ansible_search_path' from source: unknown
19285 1727203913.11407: calling self._execute()
19285 1727203913.11785: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203913.11789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203913.11791: variable 'omit' from source: magic vars
19285 1727203913.12078: variable 'ansible_distribution_major_version' from source: facts
19285 1727203913.12095: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203913.12112: variable 'omit' from source: magic vars
19285 1727203913.12152: variable 'omit' from source: magic vars
19285 1727203913.12314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203913.14467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203913.14544: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203913.14584: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203913.14631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203913.14661: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203913.14748: variable 'network_provider' from source: set_fact
19285 1727203913.14886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203913.14939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203913.14970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203913.15017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203913.15044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203913.15144: variable 'omit' from source: magic vars
19285 1727203913.15227: variable 'omit' from source: magic vars
19285 1727203913.15329: variable 'network_connections' from source: play vars
19285 1727203913.15347: variable 'interface' from source: set_fact
19285 1727203913.15471: variable 'interface' from source: set_fact
19285 1727203913.15476: variable 'interface' from source: set_fact
19285 1727203913.15509: variable 'interface' from source: set_fact
19285 1727203913.15672: variable 'omit' from source: magic vars
19285 1727203913.15696: variable '__lsr_ansible_managed' from source: task vars
19285 1727203913.15756: variable '__lsr_ansible_managed' from source: task vars
19285 1727203913.15955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
19285 1727203913.16195: Loaded config def from plugin (lookup/template)
19285 1727203913.16229: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
19285 1727203913.16248: File lookup term: get_ansible_managed.j2
19285 1727203913.16257: variable 'ansible_search_path' from source: unknown
19285 1727203913.16381: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
19285 1727203913.16385: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
19285 1727203913.16389: variable 'ansible_search_path' from source: unknown
19285 1727203913.24336: variable 'ansible_managed' from source: unknown
19285 1727203913.24476: variable 'omit' from source: magic vars
19285 1727203913.24510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19285 1727203913.24539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19285 1727203913.24567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19285 1727203913.24594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203913.24608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203913.24636: variable 'inventory_hostname' from source: host vars for 'managed-node2'
19285 1727203913.24645: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203913.24679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203913.24756: Set connection var ansible_module_compression to ZIP_DEFLATED
19285 1727203913.24772: Set connection var ansible_pipelining to False
19285 1727203913.24785: Set connection var ansible_timeout to 10
19285 1727203913.24796: Set connection var ansible_shell_type to sh
19285 1727203913.24879: Set connection var ansible_shell_executable to /bin/sh
19285 1727203913.24882: Set connection var ansible_connection to ssh
19285 1727203913.24884: variable 'ansible_shell_executable' from source: unknown
19285 1727203913.24886: variable 'ansible_connection' from source: unknown
19285 1727203913.24888: variable 'ansible_module_compression' from source: unknown
19285 1727203913.24891: variable 'ansible_shell_type' from source: unknown
19285 1727203913.24893: variable 'ansible_shell_executable' from source: unknown
19285 1727203913.24895: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203913.24897: variable 'ansible_pipelining' from source: unknown
19285 1727203913.24899: variable 'ansible_timeout' from source: unknown
19285 1727203913.24905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203913.25027: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
19285 1727203913.25062: variable 'omit' from source: magic vars
19285 1727203913.25074: starting attempt loop
19285 1727203913.25080: running the handler
19285 1727203913.25095: _low_level_execute_command(): starting
19285 1727203913.25100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
19285 1727203913.25574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203913.25580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<<
19285 1727203913.25583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203913.25585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.25641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.25645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.25647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.25721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203913.27534: stdout chunk (state=3): >>>/root <<<
19285 1727203913.27605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203913.27608: stdout chunk (state=3): >>><<<
19285 1727203913.27610: stderr chunk (state=3): >>><<<
19285 1727203913.27628: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203913.27645: _low_level_execute_command(): starting
19285 1727203913.27721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873 `" && echo ansible-tmp-1727203913.2763436-20502-212556493777873="` echo /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873 `" ) && sleep 0'
19285 1727203913.28226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203913.28293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.28345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.28359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.28383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.28489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203913.30506: stdout chunk (state=3): >>>ansible-tmp-1727203913.2763436-20502-212556493777873=/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873 <<<
19285 1727203913.30629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203913.30632: stdout chunk (state=3): >>><<<
19285 1727203913.30634: stderr chunk (state=3): >>><<<
19285 1727203913.30637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203913.2763436-20502-212556493777873=/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203913.30892: variable 'ansible_module_compression' from source: unknown
19285 1727203913.30935: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections
19285 1727203913.30938: ANSIBALLZ: Acquiring lock
19285 1727203913.30952: ANSIBALLZ: Lock acquired: 140487236851120
19285 1727203913.30955: ANSIBALLZ: Creating module
19285 1727203913.54859: ANSIBALLZ: Writing module into payload
19285 1727203913.55260: ANSIBALLZ: Writing module
19285 1727203913.55266: ANSIBALLZ: Renaming module
19285 1727203913.55271: ANSIBALLZ: Done creating module
19285 1727203913.55273: variable 'ansible_facts' from source: unknown
19285 1727203913.55338: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py
19285 1727203913.55577: Sending initial data
19285 1727203913.55581: Sent initial data (168 bytes)
19285 1727203913.56190: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203913.56198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.56263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.56284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.56297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.56409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203913.58103: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
19285 1727203913.58192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
19285 1727203913.58281: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpaa_cuft_ /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py <<<
19285 1727203913.58285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py" <<<
19285 1727203913.58339: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpaa_cuft_" to remote "/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py" <<<
19285 1727203913.59641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203913.59782: stderr chunk (state=3): >>><<<
19285 1727203913.59785: stdout chunk (state=3): >>><<<
19285 1727203913.59787: done transferring module to remote
19285 1727203913.59790: _low_level_execute_command(): starting
19285 1727203913.59792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/ /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py && sleep 0'
19285 1727203913.60366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203913.60393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203913.60428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.60443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19285 1727203913.60493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.60554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.60581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.60600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.60698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203913.62603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203913.62615: stderr chunk (state=3): >>><<<
19285 1727203913.62624: stdout chunk (state=3): >>><<<
19285 1727203913.62645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203913.62653: _low_level_execute_command(): starting
19285 1727203913.62665: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/AnsiballZ_network_connections.py && sleep 0'
19285 1727203913.63251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203913.63268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19285 1727203913.63285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203913.63301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19285 1727203913.63317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<<
19285 1727203913.63327: stderr chunk (state=3): >>>debug2: match not found <<<
19285 1727203913.63339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.63355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19285 1727203913.63369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<<
19285 1727203913.63389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
19285 1727203913.63397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19285 1727203913.63478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.63493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.63503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.63608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203913.94450: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
19285 1727203913.96737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<<
19285 1727203913.96741: stdout chunk (state=3): >>><<<
19285 1727203913.96744: stderr chunk (state=3): >>><<<
19285 1727203913.96746: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed.
19285 1727203913.96749: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
19285 1727203913.96751: _low_level_execute_command(): starting
19285 1727203913.96753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203913.2763436-20502-212556493777873/ > /dev/null 2>&1 && sleep 0'
19285 1727203913.98088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
19285 1727203913.98092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<<
19285 1727203913.98094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
19285 1727203913.98097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<<
19285 1727203913.98099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203913.98190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203913.98279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203913.98393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203913.98455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203914.00373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203914.00399: stderr chunk (state=3): >>><<<
19285 1727203914.00476: stdout chunk (state=3): >>><<<
19285 1727203914.00497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203914.00512: handler run complete
19285 1727203914.00546: attempt loop complete, returning result
19285 1727203914.00552: _execute() done
19285 1727203914.00582: dumping result to json
19285 1727203914.00591: done dumping result, returning
19285 1727203914.00602: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-f31b-fb3f-000000000024]
19285 1727203914.00687: sending task result for task 028d2410-947f-f31b-fb3f-000000000024
changed: [managed-node2] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "interface_name": "LSR-TST-br31",
                    "ip": {
                        "auto6": true,
                        "dhcp4": false
                    },
                    "name": "LSR-TST-br31",
                    "state": "up",
                    "type": "bridge"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46
[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)

19285 1727203914.00924: no more pending results, returning what we have
19285 1727203914.00928: results queue empty
19285 1727203914.00930: checking for any_errors_fatal
19285 1727203914.00938: done checking for any_errors_fatal
19285 1727203914.00939: checking for max_fail_percentage
19285 1727203914.00941: done checking for max_fail_percentage
19285 1727203914.00942: checking to see if all
hosts have failed and the running result is not ok 19285 1727203914.00943: done checking to see if all hosts have failed 19285 1727203914.00943: getting the remaining hosts for this loop 19285 1727203914.00945: done getting the remaining hosts for this loop 19285 1727203914.00950: getting the next task for host managed-node2 19285 1727203914.00957: done getting next task for host managed-node2 19285 1727203914.00963: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203914.00965: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203914.01379: getting variables 19285 1727203914.01381: in VariableManager get_vars() 19285 1727203914.01423: Calling all_inventory to load vars for managed-node2 19285 1727203914.01427: Calling groups_inventory to load vars for managed-node2 19285 1727203914.01429: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203914.01441: Calling all_plugins_play to load vars for managed-node2 19285 1727203914.01444: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203914.01447: Calling groups_plugins_play to load vars for managed-node2 19285 1727203914.02586: done sending task result for task 028d2410-947f-f31b-fb3f-000000000024 19285 1727203914.02590: WORKER PROCESS EXITING 19285 1727203914.04695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203914.07847: done with get_vars() 19285 1727203914.07882: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 
14:51:54 -0400 (0:00:00.971) 0:00:13.154 ***** 19285 1727203914.07966: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203914.07968: Creating lock for fedora.linux_system_roles.network_state 19285 1727203914.08741: worker is 1 (out of 1 available) 19285 1727203914.08755: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203914.08772: done queuing things up, now waiting for results queue to drain 19285 1727203914.08773: waiting for pending results... 19285 1727203914.09239: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203914.09344: in run() - task 028d2410-947f-f31b-fb3f-000000000025 19285 1727203914.09600: variable 'ansible_search_path' from source: unknown 19285 1727203914.09780: variable 'ansible_search_path' from source: unknown 19285 1727203914.09784: calling self._execute() 19285 1727203914.09787: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.09789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.09791: variable 'omit' from source: magic vars 19285 1727203914.10504: variable 'ansible_distribution_major_version' from source: facts 19285 1727203914.10523: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203914.10981: variable 'network_state' from source: role '' defaults 19285 1727203914.10984: Evaluated conditional (network_state != {}): False 19285 1727203914.10987: when evaluation is False, skipping this task 19285 1727203914.10989: _execute() done 19285 1727203914.10991: dumping result to json 19285 1727203914.10993: done dumping result, returning 19285 1727203914.10995: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-f31b-fb3f-000000000025] 19285 1727203914.10997: sending task result for task 
028d2410-947f-f31b-fb3f-000000000025 19285 1727203914.11064: done sending task result for task 028d2410-947f-f31b-fb3f-000000000025 19285 1727203914.11069: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203914.11124: no more pending results, returning what we have 19285 1727203914.11130: results queue empty 19285 1727203914.11131: checking for any_errors_fatal 19285 1727203914.11146: done checking for any_errors_fatal 19285 1727203914.11147: checking for max_fail_percentage 19285 1727203914.11149: done checking for max_fail_percentage 19285 1727203914.11150: checking to see if all hosts have failed and the running result is not ok 19285 1727203914.11151: done checking to see if all hosts have failed 19285 1727203914.11152: getting the remaining hosts for this loop 19285 1727203914.11153: done getting the remaining hosts for this loop 19285 1727203914.11158: getting the next task for host managed-node2 19285 1727203914.11169: done getting next task for host managed-node2 19285 1727203914.11175: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203914.11180: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203914.11196: getting variables 19285 1727203914.11198: in VariableManager get_vars() 19285 1727203914.11239: Calling all_inventory to load vars for managed-node2 19285 1727203914.11243: Calling groups_inventory to load vars for managed-node2 19285 1727203914.11245: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203914.11258: Calling all_plugins_play to load vars for managed-node2 19285 1727203914.11262: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203914.11266: Calling groups_plugins_play to load vars for managed-node2 19285 1727203914.14278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203914.17432: done with get_vars() 19285 1727203914.17469: done getting variables 19285 1727203914.17734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:51:54 -0400 (0:00:00.098) 0:00:13.252 ***** 19285 1727203914.17770: entering _queue_task() for managed-node2/debug 19285 1727203914.18519: worker is 1 (out of 1 available) 19285 1727203914.18530: exiting _queue_task() for managed-node2/debug 19285 1727203914.18540: done queuing things up, now waiting for results queue to drain 19285 1727203914.18542: waiting for pending results... 
19285 1727203914.18864: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203914.19130: in run() - task 028d2410-947f-f31b-fb3f-000000000026 19285 1727203914.19203: variable 'ansible_search_path' from source: unknown 19285 1727203914.19211: variable 'ansible_search_path' from source: unknown 19285 1727203914.19255: calling self._execute() 19285 1727203914.19581: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.19586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.19589: variable 'omit' from source: magic vars 19285 1727203914.20381: variable 'ansible_distribution_major_version' from source: facts 19285 1727203914.20581: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203914.20584: variable 'omit' from source: magic vars 19285 1727203914.20587: variable 'omit' from source: magic vars 19285 1727203914.20589: variable 'omit' from source: magic vars 19285 1727203914.20591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203914.20593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203914.20595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203914.20689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.20704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.20810: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203914.20820: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.20829: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 19285 1727203914.21136: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203914.21381: Set connection var ansible_pipelining to False 19285 1727203914.21384: Set connection var ansible_timeout to 10 19285 1727203914.21388: Set connection var ansible_shell_type to sh 19285 1727203914.21391: Set connection var ansible_shell_executable to /bin/sh 19285 1727203914.21393: Set connection var ansible_connection to ssh 19285 1727203914.21396: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.21399: variable 'ansible_connection' from source: unknown 19285 1727203914.21402: variable 'ansible_module_compression' from source: unknown 19285 1727203914.21404: variable 'ansible_shell_type' from source: unknown 19285 1727203914.21407: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.21409: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.21412: variable 'ansible_pipelining' from source: unknown 19285 1727203914.21414: variable 'ansible_timeout' from source: unknown 19285 1727203914.21416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.21423: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203914.21426: variable 'omit' from source: magic vars 19285 1727203914.21428: starting attempt loop 19285 1727203914.21430: running the handler 19285 1727203914.21789: variable '__network_connections_result' from source: set_fact 19285 1727203914.21847: handler run complete 19285 1727203914.22281: attempt loop complete, returning result 19285 1727203914.22284: _execute() done 19285 1727203914.22286: dumping result to json 19285 1727203914.22289: 
done dumping result, returning 19285 1727203914.22293: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-f31b-fb3f-000000000026] 19285 1727203914.22295: sending task result for task 028d2410-947f-f31b-fb3f-000000000026 19285 1727203914.22368: done sending task result for task 028d2410-947f-f31b-fb3f-000000000026 19285 1727203914.22372: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)" ] } 19285 1727203914.22526: no more pending results, returning what we have 19285 1727203914.22529: results queue empty 19285 1727203914.22530: checking for any_errors_fatal 19285 1727203914.22537: done checking for any_errors_fatal 19285 1727203914.22537: checking for max_fail_percentage 19285 1727203914.22539: done checking for max_fail_percentage 19285 1727203914.22540: checking to see if all hosts have failed and the running result is not ok 19285 1727203914.22541: done checking to see if all hosts have failed 19285 1727203914.22542: getting the remaining hosts for this loop 19285 1727203914.22544: done getting the remaining hosts for this loop 19285 1727203914.22548: getting the next task for host managed-node2 19285 1727203914.22554: done getting next task for host managed-node2 19285 1727203914.22557: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203914.22559: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 19285 1727203914.22678: getting variables 19285 1727203914.22680: in VariableManager get_vars() 19285 1727203914.22714: Calling all_inventory to load vars for managed-node2 19285 1727203914.22717: Calling groups_inventory to load vars for managed-node2 19285 1727203914.22719: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203914.22728: Calling all_plugins_play to load vars for managed-node2 19285 1727203914.22730: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203914.22732: Calling groups_plugins_play to load vars for managed-node2 19285 1727203914.25487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203914.28567: done with get_vars() 19285 1727203914.28600: done getting variables 19285 1727203914.28663: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:51:54 -0400 (0:00:00.109) 0:00:13.361 ***** 19285 1727203914.28698: entering _queue_task() for managed-node2/debug 19285 1727203914.29509: worker is 1 (out of 1 available) 19285 1727203914.29522: exiting _queue_task() for managed-node2/debug 19285 1727203914.29533: done queuing things up, now waiting for results queue to drain 19285 1727203914.29534: waiting for pending results... 
19285 1727203914.29864: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203914.29958: in run() - task 028d2410-947f-f31b-fb3f-000000000027 19285 1727203914.30126: variable 'ansible_search_path' from source: unknown 19285 1727203914.30135: variable 'ansible_search_path' from source: unknown 19285 1727203914.30184: calling self._execute() 19285 1727203914.30356: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.30443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.30459: variable 'omit' from source: magic vars 19285 1727203914.31317: variable 'ansible_distribution_major_version' from source: facts 19285 1727203914.31333: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203914.31343: variable 'omit' from source: magic vars 19285 1727203914.31390: variable 'omit' from source: magic vars 19285 1727203914.31502: variable 'omit' from source: magic vars 19285 1727203914.31553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203914.31671: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203914.31696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203914.31762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.31781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.31814: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203914.31887: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.31896: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 19285 1727203914.32119: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203914.32132: Set connection var ansible_pipelining to False 19285 1727203914.32142: Set connection var ansible_timeout to 10 19285 1727203914.32149: Set connection var ansible_shell_type to sh 19285 1727203914.32163: Set connection var ansible_shell_executable to /bin/sh 19285 1727203914.32383: Set connection var ansible_connection to ssh 19285 1727203914.32386: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.32389: variable 'ansible_connection' from source: unknown 19285 1727203914.32392: variable 'ansible_module_compression' from source: unknown 19285 1727203914.32395: variable 'ansible_shell_type' from source: unknown 19285 1727203914.32397: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.32400: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.32402: variable 'ansible_pipelining' from source: unknown 19285 1727203914.32405: variable 'ansible_timeout' from source: unknown 19285 1727203914.32407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.32621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203914.32637: variable 'omit' from source: magic vars 19285 1727203914.32646: starting attempt loop 19285 1727203914.32781: running the handler 19285 1727203914.32784: variable '__network_connections_result' from source: set_fact 19285 1727203914.32883: variable '__network_connections_result' from source: set_fact 19285 1727203914.33363: handler run complete 19285 1727203914.33366: attempt loop complete, returning result 19285 1727203914.33368: 
_execute() done 19285 1727203914.33370: dumping result to json 19285 1727203914.33372: done dumping result, returning 19285 1727203914.33377: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-f31b-fb3f-000000000027] 19285 1727203914.33379: sending task result for task 028d2410-947f-f31b-fb3f-000000000027 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, a5e85d14-14c9-4d10-940b-6ee660088f46 (not-active)" ] } } 19285 1727203914.33593: no more pending results, returning what we have 19285 1727203914.33598: results queue empty 19285 1727203914.33599: checking for any_errors_fatal 19285 1727203914.33606: done checking for any_errors_fatal 19285 1727203914.33606: checking for max_fail_percentage 19285 1727203914.33608: done checking for max_fail_percentage 19285 1727203914.33609: checking to see if all hosts have failed and the running result is not ok 19285 1727203914.33610: done checking to see if all hosts have failed 19285 1727203914.33611: getting the remaining hosts for 
this loop 19285 1727203914.33612: done getting the remaining hosts for this loop 19285 1727203914.33616: getting the next task for host managed-node2 19285 1727203914.33624: done getting next task for host managed-node2 19285 1727203914.33627: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19285 1727203914.33629: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203914.33637: getting variables 19285 1727203914.33639: in VariableManager get_vars() 19285 1727203914.33672: Calling all_inventory to load vars for managed-node2 19285 1727203914.33675: Calling groups_inventory to load vars for managed-node2 19285 1727203914.33782: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203914.33795: Calling all_plugins_play to load vars for managed-node2 19285 1727203914.33798: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203914.33801: Calling groups_plugins_play to load vars for managed-node2 19285 1727203914.34366: done sending task result for task 028d2410-947f-f31b-fb3f-000000000027 19285 1727203914.34371: WORKER PROCESS EXITING 19285 1727203914.35801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203914.37666: done with get_vars() 19285 1727203914.37841: done getting variables 19285 1727203914.38004: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:51:54 -0400 (0:00:00.093) 0:00:13.454 ***** 19285 1727203914.38042: entering _queue_task() for managed-node2/debug 19285 1727203914.38423: worker is 1 (out of 1 available) 19285 1727203914.38438: exiting _queue_task() for managed-node2/debug 19285 1727203914.38451: done queuing things up, now waiting for results queue to drain 19285 1727203914.38452: waiting for pending results... 19285 1727203914.38793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19285 1727203914.38837: in run() - task 028d2410-947f-f31b-fb3f-000000000028 19285 1727203914.38859: variable 'ansible_search_path' from source: unknown 19285 1727203914.38867: variable 'ansible_search_path' from source: unknown 19285 1727203914.38919: calling self._execute() 19285 1727203914.39025: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.39038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.39055: variable 'omit' from source: magic vars 19285 1727203914.39471: variable 'ansible_distribution_major_version' from source: facts 19285 1727203914.39491: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203914.39621: variable 'network_state' from source: role '' defaults 19285 1727203914.39680: Evaluated conditional (network_state != {}): False 19285 1727203914.39683: when evaluation is False, skipping this task 19285 1727203914.39686: _execute() done 19285 1727203914.39688: dumping result to json 19285 1727203914.39690: done dumping result, returning 19285 1727203914.39693: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-f31b-fb3f-000000000028] 19285 
1727203914.39694: sending task result for task 028d2410-947f-f31b-fb3f-000000000028 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 19285 1727203914.39942: no more pending results, returning what we have 19285 1727203914.39946: results queue empty 19285 1727203914.39948: checking for any_errors_fatal 19285 1727203914.39960: done checking for any_errors_fatal 19285 1727203914.39961: checking for max_fail_percentage 19285 1727203914.39963: done checking for max_fail_percentage 19285 1727203914.39964: checking to see if all hosts have failed and the running result is not ok 19285 1727203914.39964: done checking to see if all hosts have failed 19285 1727203914.39965: getting the remaining hosts for this loop 19285 1727203914.39967: done getting the remaining hosts for this loop 19285 1727203914.39971: getting the next task for host managed-node2 19285 1727203914.39981: done getting next task for host managed-node2 19285 1727203914.39985: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19285 1727203914.39988: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203914.40080: getting variables 19285 1727203914.40083: in VariableManager get_vars() 19285 1727203914.40236: Calling all_inventory to load vars for managed-node2 19285 1727203914.40240: Calling groups_inventory to load vars for managed-node2 19285 1727203914.40243: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203914.40330: done sending task result for task 028d2410-947f-f31b-fb3f-000000000028 19285 1727203914.40333: WORKER PROCESS EXITING 19285 1727203914.40347: Calling all_plugins_play to load vars for managed-node2 19285 1727203914.40350: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203914.40354: Calling groups_plugins_play to load vars for managed-node2 19285 1727203914.42616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203914.45290: done with get_vars() 19285 1727203914.45327: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:51:54 -0400 (0:00:00.074) 0:00:13.529 ***** 19285 1727203914.45541: entering _queue_task() for managed-node2/ping 19285 1727203914.45543: Creating lock for ping 19285 1727203914.46289: worker is 1 (out of 1 available) 19285 1727203914.46302: exiting _queue_task() for managed-node2/ping 19285 1727203914.46324: done queuing things up, now waiting for results queue to drain 19285 1727203914.46326: waiting for pending results... 
19285 1727203914.46638: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 19285 1727203914.47182: in run() - task 028d2410-947f-f31b-fb3f-000000000029 19285 1727203914.47185: variable 'ansible_search_path' from source: unknown 19285 1727203914.47188: variable 'ansible_search_path' from source: unknown 19285 1727203914.47190: calling self._execute() 19285 1727203914.47192: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.47194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.47393: variable 'omit' from source: magic vars 19285 1727203914.48423: variable 'ansible_distribution_major_version' from source: facts 19285 1727203914.48489: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203914.48533: variable 'omit' from source: magic vars 19285 1727203914.48633: variable 'omit' from source: magic vars 19285 1727203914.48683: variable 'omit' from source: magic vars 19285 1727203914.48732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203914.48779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203914.48981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203914.48984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.48986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203914.48989: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203914.48991: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.48992: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 19285 1727203914.49178: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203914.49292: Set connection var ansible_pipelining to False 19285 1727203914.49303: Set connection var ansible_timeout to 10 19285 1727203914.49309: Set connection var ansible_shell_type to sh 19285 1727203914.49320: Set connection var ansible_shell_executable to /bin/sh 19285 1727203914.49327: Set connection var ansible_connection to ssh 19285 1727203914.49356: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.49680: variable 'ansible_connection' from source: unknown 19285 1727203914.49684: variable 'ansible_module_compression' from source: unknown 19285 1727203914.49686: variable 'ansible_shell_type' from source: unknown 19285 1727203914.49688: variable 'ansible_shell_executable' from source: unknown 19285 1727203914.49690: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203914.49692: variable 'ansible_pipelining' from source: unknown 19285 1727203914.49694: variable 'ansible_timeout' from source: unknown 19285 1727203914.49696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203914.49913: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203914.49934: variable 'omit' from source: magic vars 19285 1727203914.49945: starting attempt loop 19285 1727203914.49953: running the handler 19285 1727203914.49979: _low_level_execute_command(): starting 19285 1727203914.49993: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203914.50999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203914.51020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203914.51044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203914.51209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203914.52934: stdout chunk (state=3): >>>/root <<< 19285 1727203914.53095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203914.53193: stderr chunk (state=3): >>><<< 19285 1727203914.53204: stdout chunk (state=3): >>><<< 19285 1727203914.53385: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203914.53389: _low_level_execute_command(): starting 19285 1727203914.53392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518 `" && echo ansible-tmp-1727203914.5329447-20598-64579051758518="` echo /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518 `" ) && sleep 0' 19285 1727203914.54628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203914.54702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203914.54920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203914.54936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203914.55029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203914.56979: stdout chunk (state=3): >>>ansible-tmp-1727203914.5329447-20598-64579051758518=/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518 <<< 19285 1727203914.57385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203914.57390: stderr chunk (state=3): >>><<< 19285 1727203914.57392: stdout chunk (state=3): >>><<< 19285 1727203914.57395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203914.5329447-20598-64579051758518=/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203914.57397: variable 'ansible_module_compression' from source: unknown 19285 1727203914.57400: ANSIBALLZ: Using lock for ping 19285 1727203914.57401: ANSIBALLZ: Acquiring lock 19285 1727203914.57403: ANSIBALLZ: Lock acquired: 140487235148000 19285 1727203914.57405: ANSIBALLZ: Creating module 19285 1727203914.83262: ANSIBALLZ: Writing module into payload 19285 1727203914.83471: ANSIBALLZ: Writing module 19285 1727203914.83491: ANSIBALLZ: Renaming module 19285 1727203914.83498: ANSIBALLZ: Done creating module 19285 1727203914.83514: variable 'ansible_facts' from source: unknown 19285 1727203914.83635: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py 19285 1727203914.83992: Sending initial data 19285 1727203914.83996: Sent initial data (152 bytes) 19285 1727203914.85984: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203914.86172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203914.86178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203914.86182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19285 1727203914.86185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203914.86200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203914.86343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203914.86374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203914.86708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203914.88332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203914.88413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203914.88511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpdcfolxbd /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py <<< 19285 1727203914.88515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py" <<< 19285 1727203914.88586: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpdcfolxbd" to remote "/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py" <<< 19285 1727203914.89522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203914.89650: stderr chunk (state=3): >>><<< 19285 1727203914.89652: stdout chunk (state=3): >>><<< 19285 1727203914.89654: done transferring module to remote 19285 1727203914.89686: _low_level_execute_command(): starting 19285 1727203914.89696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/ /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py && sleep 0' 19285 1727203914.90653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203914.90727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203914.90895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203914.91011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203914.91206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203914.93082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203914.93087: stdout chunk (state=3): >>><<< 19285 1727203914.93089: stderr chunk (state=3): >>><<< 19285 1727203914.93107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203914.93194: _low_level_execute_command(): starting 19285 1727203914.93197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/AnsiballZ_ping.py && sleep 0' 19285 1727203914.93741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203914.93756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203914.93773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203914.93795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203914.93811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203914.93908: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203914.93922: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19285 1727203914.94031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.08978: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19285 1727203915.10307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203915.10311: stdout chunk (state=3): >>><<< 19285 1727203915.10324: stderr chunk (state=3): >>><<< 19285 1727203915.10336: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
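The module run above returned `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout. The ping module's contract is simply to echo its `data` argument (default `"pong"`) back as the `ping` field; a minimal stand-in sketch of that contract (not the real AnsiballZ module):

```python
import json

# Minimal sketch of the ping module contract: echo the 'data' argument
# back as "ping", and report the invocation arguments, as seen in the log.
def ping_module(module_args: dict) -> dict:
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
```

A non-default `data` value would be echoed back the same way, which is what makes the module usable as a connectivity re-test.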
19285 1727203915.10364: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203915.10380: _low_level_execute_command(): starting 19285 1727203915.10388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203914.5329447-20598-64579051758518/ > /dev/null 2>&1 && sleep 0' 19285 1727203915.11095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203915.11137: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203915.11150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203915.11168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.11265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.13143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203915.13247: stderr chunk (state=3): >>><<< 19285 1727203915.13250: stdout chunk (state=3): >>><<< 19285 1727203915.13396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203915.13406: handler run complete 19285 1727203915.13408: attempt loop complete, returning result 19285 1727203915.13411: _execute() done 
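The exchanges above follow Ansible's standard remote-execution lifecycle visible in this log: create a umask-protected temp directory, SFTP the AnsiballZ payload into it, `chmod u+x` the directory and payload, run the payload with the remote Python, then remove the directory. A sketch reconstructing that command sequence (directory-name format mirrors the log; exact paths and the random suffix are hypothetical):

```python
import time
import random

def remote_exec_commands(remote_tmp: str = "/root/.ansible/tmp",
                         python: str = "/usr/bin/python3.12") -> list:
    # Temp dir name mirrors the ansible-tmp-<epoch>-<pid>-<random> pattern
    # seen in the log (e.g. ansible-tmp-1727203914.5329447-20598-...).
    tmpdir = f"{remote_tmp}/ansible-tmp-{time.time()}-{random.randrange(10**14)}"
    payload = f"{tmpdir}/AnsiballZ_ping.py"
    return [
        f'( umask 77 && mkdir -p "{remote_tmp}" && mkdir "{tmpdir}" )',  # create
        f"sftp put <local payload> {payload}",                           # transfer
        f"chmod u+x {tmpdir}/ {payload}",                                # mark executable
        f"{python} {payload}",                                           # run module
        f"rm -f -r {tmpdir}/ > /dev/null 2>&1",                          # clean up
    ]

for cmd in remote_exec_commands():
    print(cmd)
```

Each of the five commands corresponds to one `_low_level_execute_command()` (or SFTP transfer) round-trip in the log, which is why the same SSH multiplexed-connection debug output repeats before every step.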
19285 1727203915.13413: dumping result to json 19285 1727203915.13415: done dumping result, returning 19285 1727203915.13418: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-f31b-fb3f-000000000029] 19285 1727203915.13420: sending task result for task 028d2410-947f-f31b-fb3f-000000000029 19285 1727203915.13499: done sending task result for task 028d2410-947f-f31b-fb3f-000000000029 19285 1727203915.13503: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 19285 1727203915.13579: no more pending results, returning what we have 19285 1727203915.13583: results queue empty 19285 1727203915.13585: checking for any_errors_fatal 19285 1727203915.13593: done checking for any_errors_fatal 19285 1727203915.13593: checking for max_fail_percentage 19285 1727203915.13595: done checking for max_fail_percentage 19285 1727203915.13596: checking to see if all hosts have failed and the running result is not ok 19285 1727203915.13597: done checking to see if all hosts have failed 19285 1727203915.13598: getting the remaining hosts for this loop 19285 1727203915.13600: done getting the remaining hosts for this loop 19285 1727203915.13603: getting the next task for host managed-node2 19285 1727203915.13611: done getting next task for host managed-node2 19285 1727203915.13614: ^ task is: TASK: meta (role_complete) 19285 1727203915.13616: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203915.13627: getting variables 19285 1727203915.13630: in VariableManager get_vars() 19285 1727203915.13667: Calling all_inventory to load vars for managed-node2 19285 1727203915.13670: Calling groups_inventory to load vars for managed-node2 19285 1727203915.13673: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.13889: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.13893: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203915.13897: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.16530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.18852: done with get_vars() 19285 1727203915.18934: done getting variables 19285 1727203915.19117: done queuing things up, now waiting for results queue to drain 19285 1727203915.19119: results queue empty 19285 1727203915.19120: checking for any_errors_fatal 19285 1727203915.19122: done checking for any_errors_fatal 19285 1727203915.19123: checking for max_fail_percentage 19285 1727203915.19124: done checking for max_fail_percentage 19285 1727203915.19125: checking to see if all hosts have failed and the running result is not ok 19285 1727203915.19126: done checking to see if all hosts have failed 19285 1727203915.19126: getting the remaining hosts for this loop 19285 1727203915.19127: done getting the remaining hosts for this loop 19285 1727203915.19130: getting the next task for host managed-node2 19285 1727203915.19134: done getting next task for host managed-node2 19285 1727203915.19135: ^ task is: TASK: meta (flush_handlers) 19285 1727203915.19137: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19285 1727203915.19140: getting variables 19285 1727203915.19141: in VariableManager get_vars() 19285 1727203915.19161: Calling all_inventory to load vars for managed-node2 19285 1727203915.19164: Calling groups_inventory to load vars for managed-node2 19285 1727203915.19167: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.19174: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.19283: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203915.19288: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.22323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.26040: done with get_vars() 19285 1727203915.26064: done getting variables 19285 1727203915.26146: in VariableManager get_vars() 19285 1727203915.26159: Calling all_inventory to load vars for managed-node2 19285 1727203915.26164: Calling groups_inventory to load vars for managed-node2 19285 1727203915.26166: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.26171: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.26196: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203915.26201: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.28098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.30786: done with get_vars() 19285 1727203915.30879: done queuing things up, now waiting for results queue to drain 19285 1727203915.30882: results queue empty 19285 1727203915.30884: checking for any_errors_fatal 19285 1727203915.30886: done checking for any_errors_fatal 19285 1727203915.30886: checking for max_fail_percentage 19285 1727203915.30887: done checking for max_fail_percentage 19285 1727203915.30888: checking to see if all hosts have failed and 
the running result is not ok 19285 1727203915.30889: done checking to see if all hosts have failed 19285 1727203915.30890: getting the remaining hosts for this loop 19285 1727203915.30891: done getting the remaining hosts for this loop 19285 1727203915.30893: getting the next task for host managed-node2 19285 1727203915.30899: done getting next task for host managed-node2 19285 1727203915.30900: ^ task is: TASK: meta (flush_handlers) 19285 1727203915.30902: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203915.30905: getting variables 19285 1727203915.30906: in VariableManager get_vars() 19285 1727203915.30986: Calling all_inventory to load vars for managed-node2 19285 1727203915.30989: Calling groups_inventory to load vars for managed-node2 19285 1727203915.30991: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.30997: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.30999: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203915.31002: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.34057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.36947: done with get_vars() 19285 1727203915.36973: done getting variables 19285 1727203915.37029: in VariableManager get_vars() 19285 1727203915.37044: Calling all_inventory to load vars for managed-node2 19285 1727203915.37046: Calling groups_inventory to load vars for managed-node2 19285 1727203915.37048: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.37054: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.37056: Calling 
groups_plugins_inventory to load vars for managed-node2 19285 1727203915.37059: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.39817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.42073: done with get_vars() 19285 1727203915.42107: done queuing things up, now waiting for results queue to drain 19285 1727203915.42109: results queue empty 19285 1727203915.42110: checking for any_errors_fatal 19285 1727203915.42111: done checking for any_errors_fatal 19285 1727203915.42112: checking for max_fail_percentage 19285 1727203915.42113: done checking for max_fail_percentage 19285 1727203915.42114: checking to see if all hosts have failed and the running result is not ok 19285 1727203915.42114: done checking to see if all hosts have failed 19285 1727203915.42115: getting the remaining hosts for this loop 19285 1727203915.42116: done getting the remaining hosts for this loop 19285 1727203915.42119: getting the next task for host managed-node2 19285 1727203915.42123: done getting next task for host managed-node2 19285 1727203915.42123: ^ task is: None 19285 1727203915.42125: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203915.42126: done queuing things up, now waiting for results queue to drain 19285 1727203915.42127: results queue empty 19285 1727203915.42128: checking for any_errors_fatal 19285 1727203915.42129: done checking for any_errors_fatal 19285 1727203915.42129: checking for max_fail_percentage 19285 1727203915.42130: done checking for max_fail_percentage 19285 1727203915.42131: checking to see if all hosts have failed and the running result is not ok 19285 1727203915.42132: done checking to see if all hosts have failed 19285 1727203915.42133: getting the next task for host managed-node2 19285 1727203915.42135: done getting next task for host managed-node2 19285 1727203915.42136: ^ task is: None 19285 1727203915.42140: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203915.42189: in VariableManager get_vars() 19285 1727203915.42205: done with get_vars() 19285 1727203915.42211: in VariableManager get_vars() 19285 1727203915.42220: done with get_vars() 19285 1727203915.42224: variable 'omit' from source: magic vars 19285 1727203915.42348: variable 'task' from source: play vars 19285 1727203915.42391: in VariableManager get_vars() 19285 1727203915.42403: done with get_vars() 19285 1727203915.42421: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 19285 1727203915.42599: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203915.42621: getting the remaining hosts for this loop 19285 1727203915.42622: done getting the remaining hosts for this loop 19285 1727203915.42625: getting the next task for host managed-node2 19285 1727203915.42628: done getting next task for host managed-node2 19285 1727203915.42630: ^ task is: TASK: Gathering Facts 19285 1727203915.42631: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203915.42633: getting variables 19285 1727203915.42634: in VariableManager get_vars() 19285 1727203915.42642: Calling all_inventory to load vars for managed-node2 19285 1727203915.42644: Calling groups_inventory to load vars for managed-node2 19285 1727203915.42647: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203915.42652: Calling all_plugins_play to load vars for managed-node2 19285 1727203915.42656: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203915.42659: Calling groups_plugins_play to load vars for managed-node2 19285 1727203915.44297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203915.46447: done with get_vars() 19285 1727203915.46472: done getting variables 19285 1727203915.46518: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:51:55 -0400 (0:00:01.010) 0:00:14.539 ***** 19285 1727203915.46544: entering _queue_task() for managed-node2/gather_facts 19285 1727203915.46871: worker is 1 (out of 1 available) 19285 1727203915.47083: exiting _queue_task() for managed-node2/gather_facts 19285 1727203915.47093: done queuing things up, now waiting for results queue to drain 19285 1727203915.47094: waiting for pending results... 
19285 1727203915.47511: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203915.47636: in run() - task 028d2410-947f-f31b-fb3f-000000000219 19285 1727203915.47854: variable 'ansible_search_path' from source: unknown 19285 1727203915.47858: calling self._execute() 19285 1727203915.47951: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203915.48050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203915.48074: variable 'omit' from source: magic vars 19285 1727203915.48881: variable 'ansible_distribution_major_version' from source: facts 19285 1727203915.48899: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203915.48910: variable 'omit' from source: magic vars 19285 1727203915.48948: variable 'omit' from source: magic vars 19285 1727203915.49069: variable 'omit' from source: magic vars 19285 1727203915.49133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203915.49285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203915.49311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203915.49380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203915.49398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203915.49578: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203915.49581: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203915.49583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203915.49709: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203915.49723: Set connection var ansible_pipelining to False 19285 1727203915.49735: Set connection var ansible_timeout to 10 19285 1727203915.49743: Set connection var ansible_shell_type to sh 19285 1727203915.49902: Set connection var ansible_shell_executable to /bin/sh 19285 1727203915.49905: Set connection var ansible_connection to ssh 19285 1727203915.49908: variable 'ansible_shell_executable' from source: unknown 19285 1727203915.49910: variable 'ansible_connection' from source: unknown 19285 1727203915.49912: variable 'ansible_module_compression' from source: unknown 19285 1727203915.49914: variable 'ansible_shell_type' from source: unknown 19285 1727203915.49916: variable 'ansible_shell_executable' from source: unknown 19285 1727203915.49918: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203915.49921: variable 'ansible_pipelining' from source: unknown 19285 1727203915.49923: variable 'ansible_timeout' from source: unknown 19285 1727203915.49924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203915.50448: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203915.50452: variable 'omit' from source: magic vars 19285 1727203915.50455: starting attempt loop 19285 1727203915.50457: running the handler 19285 1727203915.50459: variable 'ansible_facts' from source: unknown 19285 1727203915.50464: _low_level_execute_command(): starting 19285 1727203915.50466: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203915.52201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203915.52302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.52419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.54135: stdout chunk (state=3): >>>/root <<< 19285 1727203915.54264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203915.54470: stderr chunk (state=3): >>><<< 19285 1727203915.54473: stdout chunk (state=3): >>><<< 19285 1727203915.54477: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203915.54480: _low_level_execute_command(): starting 19285 1727203915.54482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943 `" && echo ansible-tmp-1727203915.5441031-20712-201722051354943="` echo /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943 `" ) && sleep 0' 19285 1727203915.55784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203915.55924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203915.56017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203915.56119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.56188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.58200: stdout chunk (state=3): >>>ansible-tmp-1727203915.5441031-20712-201722051354943=/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943 <<< 19285 1727203915.58334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203915.58482: stderr chunk (state=3): >>><<< 19285 1727203915.58485: stdout chunk (state=3): >>><<< 19285 1727203915.58489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203915.5441031-20712-201722051354943=/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203915.58492: variable 'ansible_module_compression' from source: unknown 19285 1727203915.58617: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203915.58751: variable 'ansible_facts' from source: unknown 19285 1727203915.59258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py 19285 1727203915.59605: Sending initial data 19285 1727203915.59609: Sent initial data (154 bytes) 19285 1727203915.60893: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203915.60929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203915.60947: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203915.61000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.61124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.62713: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19285 1727203915.62730: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 19285 1727203915.62742: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 19285 1727203915.62753: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 19285 1727203915.62787: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203915.62850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203915.62944: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpqp10jk77 /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py <<< 19285 1727203915.62954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py" <<< 19285 1727203915.63147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpqp10jk77" to remote "/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py" <<< 19285 1727203915.66110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203915.66230: stderr chunk (state=3): >>><<< 19285 1727203915.66234: stdout chunk (state=3): >>><<< 19285 1727203915.66236: done transferring module to remote 19285 1727203915.66387: _low_level_execute_command(): starting 19285 1727203915.66390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/ /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py && sleep 0' 19285 1727203915.67605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203915.67834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203915.68017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.68292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203915.69890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203915.69927: stderr chunk (state=3): >>><<< 19285 1727203915.69930: stdout chunk (state=3): >>><<< 19285 1727203915.69946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203915.69954: _low_level_execute_command(): starting 19285 1727203915.69969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/AnsiballZ_setup.py && sleep 0' 19285 1727203915.71344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203915.71484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203915.71558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203916.35227: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.63134765625, "5m": 0.41748046875, "15m": 0.20556640625}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_versi<<< 19285 1727203916.35427: stdout chunk (state=3): >>>on": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 502, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788045312, "block_size": 4096, 
"block_total": 65519099, "block_available": 63913097, "block_used": 1606002, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off 
[fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "56", "epoch": "1727203916", "epoch_int": "1727203916", "date": "2024-09-24", "time": "14:51:56", "iso8601_micro": "2024-09-24T18:51:56.348971Z", "iso8601": 
"2024-09-24T18:51:56Z", "iso8601_basic": "20240924T145156348971", "iso8601_basic_short": "20240924T145156", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203916.37267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203916.37292: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. <<< 19285 1727203916.37344: stderr chunk (state=3): >>><<< 19285 1727203916.37396: stdout chunk (state=3): >>><<< 19285 1727203916.37537: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.63134765625, "5m": 0.41748046875, "15m": 0.20556640625}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": 
"250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 502, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788045312, "block_size": 4096, "block_total": 65519099, "block_available": 63913097, "block_used": 1606002, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", 
"macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 
10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "56", "epoch": "1727203916", "epoch_int": "1727203916", "date": "2024-09-24", "time": "14:51:56", "iso8601_micro": "2024-09-24T18:51:56.348971Z", "iso8601": "2024-09-24T18:51:56Z", "iso8601_basic": "20240924T145156348971", "iso8601_basic_short": "20240924T145156", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203916.38464: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203916.38651: _low_level_execute_command(): starting 19285 1727203916.38660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203915.5441031-20712-201722051354943/ > /dev/null 2>&1 && sleep 0' 19285 1727203916.39690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203916.39707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203916.39773: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203916.39883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203916.39908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203916.39930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203916.40026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203916.41988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203916.42026: stderr chunk (state=3): >>><<< 19285 1727203916.42041: stdout chunk (state=3): >>><<< 19285 1727203916.42119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203916.42123: handler run complete 19285 1727203916.42228: variable 'ansible_facts' from source: unknown 19285 1727203916.42328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.42669: variable 'ansible_facts' from source: unknown 19285 1727203916.42759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.42894: attempt loop complete, returning result 19285 1727203916.42902: _execute() done 19285 1727203916.42907: dumping result to json 19285 1727203916.42937: done dumping result, returning 19285 1727203916.42948: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-000000000219] 19285 1727203916.42955: sending task result for task 028d2410-947f-f31b-fb3f-000000000219 ok: [managed-node2] 19285 1727203916.43881: no more pending results, returning what we have 19285 1727203916.43884: results queue empty 19285 1727203916.43885: checking for any_errors_fatal 19285 1727203916.43886: done checking for any_errors_fatal 19285 1727203916.43887: checking for max_fail_percentage 19285 1727203916.43888: done checking for max_fail_percentage 19285 1727203916.43889: checking to see if all hosts have failed and the running result is not ok 19285 1727203916.43890: done checking to see if all hosts have 
failed 19285 1727203916.43891: getting the remaining hosts for this loop 19285 1727203916.43892: done getting the remaining hosts for this loop 19285 1727203916.43896: getting the next task for host managed-node2 19285 1727203916.43901: done getting next task for host managed-node2 19285 1727203916.43902: ^ task is: TASK: meta (flush_handlers) 19285 1727203916.43904: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203916.43907: getting variables 19285 1727203916.43909: in VariableManager get_vars() 19285 1727203916.43930: Calling all_inventory to load vars for managed-node2 19285 1727203916.43932: Calling groups_inventory to load vars for managed-node2 19285 1727203916.43936: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.43942: done sending task result for task 028d2410-947f-f31b-fb3f-000000000219 19285 1727203916.43945: WORKER PROCESS EXITING 19285 1727203916.43954: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.43957: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.43961: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.61053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.63147: done with get_vars() 19285 1727203916.63182: done getting variables 19285 1727203916.63246: in VariableManager get_vars() 19285 1727203916.63257: Calling all_inventory to load vars for managed-node2 19285 1727203916.63259: Calling groups_inventory to load vars for managed-node2 19285 1727203916.63264: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.63270: Calling all_plugins_play to load vars for managed-node2 
19285 1727203916.63272: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.63277: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.65247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.69047: done with get_vars() 19285 1727203916.69086: done queuing things up, now waiting for results queue to drain 19285 1727203916.69089: results queue empty 19285 1727203916.69089: checking for any_errors_fatal 19285 1727203916.69093: done checking for any_errors_fatal 19285 1727203916.69094: checking for max_fail_percentage 19285 1727203916.69095: done checking for max_fail_percentage 19285 1727203916.69100: checking to see if all hosts have failed and the running result is not ok 19285 1727203916.69101: done checking to see if all hosts have failed 19285 1727203916.69102: getting the remaining hosts for this loop 19285 1727203916.69103: done getting the remaining hosts for this loop 19285 1727203916.69106: getting the next task for host managed-node2 19285 1727203916.69109: done getting next task for host managed-node2 19285 1727203916.69112: ^ task is: TASK: Include the task '{{ task }}' 19285 1727203916.69113: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203916.69115: getting variables 19285 1727203916.69116: in VariableManager get_vars() 19285 1727203916.69125: Calling all_inventory to load vars for managed-node2 19285 1727203916.69127: Calling groups_inventory to load vars for managed-node2 19285 1727203916.69130: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.69136: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.69138: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.69141: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.70841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.73445: done with get_vars() 19285 1727203916.73548: done getting variables 19285 1727203916.73726: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:51:56 -0400 (0:00:01.272) 0:00:15.812 ***** 19285 1727203916.73754: entering _queue_task() for managed-node2/include_tasks 19285 1727203916.74263: worker is 1 (out of 1 available) 19285 1727203916.74477: exiting _queue_task() for managed-node2/include_tasks 19285 1727203916.74489: done queuing things up, now waiting for results queue to drain 19285 1727203916.74490: waiting for pending results... 
19285 1727203916.74567: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml' 19285 1727203916.74687: in run() - task 028d2410-947f-f31b-fb3f-00000000002d 19285 1727203916.74706: variable 'ansible_search_path' from source: unknown 19285 1727203916.74777: calling self._execute() 19285 1727203916.74878: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203916.74890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203916.74905: variable 'omit' from source: magic vars 19285 1727203916.75341: variable 'ansible_distribution_major_version' from source: facts 19285 1727203916.75367: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203916.75381: variable 'task' from source: play vars 19285 1727203916.75487: variable 'task' from source: play vars 19285 1727203916.75499: _execute() done 19285 1727203916.75507: dumping result to json 19285 1727203916.75513: done dumping result, returning 19285 1727203916.75555: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_present.yml' [028d2410-947f-f31b-fb3f-00000000002d] 19285 1727203916.75581: sending task result for task 028d2410-947f-f31b-fb3f-00000000002d 19285 1727203916.75710: no more pending results, returning what we have 19285 1727203916.75716: in VariableManager get_vars() 19285 1727203916.75753: Calling all_inventory to load vars for managed-node2 19285 1727203916.75756: Calling groups_inventory to load vars for managed-node2 19285 1727203916.75760: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.75775: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.75779: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.75783: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.76623: done sending task result for task 
028d2410-947f-f31b-fb3f-00000000002d 19285 1727203916.76626: WORKER PROCESS EXITING 19285 1727203916.78071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.80595: done with get_vars() 19285 1727203916.80642: variable 'ansible_search_path' from source: unknown 19285 1727203916.80663: we have included files to process 19285 1727203916.80664: generating all_blocks data 19285 1727203916.80666: done generating all_blocks data 19285 1727203916.80666: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19285 1727203916.80668: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19285 1727203916.80670: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19285 1727203916.80841: in VariableManager get_vars() 19285 1727203916.80857: done with get_vars() 19285 1727203916.80969: done processing included file 19285 1727203916.80971: iterating over new_blocks loaded from include file 19285 1727203916.80972: in VariableManager get_vars() 19285 1727203916.80989: done with get_vars() 19285 1727203916.80991: filtering new block on tags 19285 1727203916.81008: done filtering new block on tags 19285 1727203916.81011: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 19285 1727203916.81015: extending task lists for all hosts with included blocks 19285 1727203916.81044: done extending task lists 19285 1727203916.81045: done processing included files 19285 1727203916.81046: results queue empty 19285 1727203916.81047: checking for any_errors_fatal 19285 1727203916.81048: done checking 
for any_errors_fatal 19285 1727203916.81049: checking for max_fail_percentage 19285 1727203916.81050: done checking for max_fail_percentage 19285 1727203916.81050: checking to see if all hosts have failed and the running result is not ok 19285 1727203916.81051: done checking to see if all hosts have failed 19285 1727203916.81052: getting the remaining hosts for this loop 19285 1727203916.81054: done getting the remaining hosts for this loop 19285 1727203916.81056: getting the next task for host managed-node2 19285 1727203916.81060: done getting next task for host managed-node2 19285 1727203916.81062: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19285 1727203916.81064: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203916.81066: getting variables 19285 1727203916.81067: in VariableManager get_vars() 19285 1727203916.81077: Calling all_inventory to load vars for managed-node2 19285 1727203916.81079: Calling groups_inventory to load vars for managed-node2 19285 1727203916.81081: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.81087: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.81089: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.81096: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.83589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.86213: done with get_vars() 19285 1727203916.86243: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:51:56 -0400 (0:00:00.125) 0:00:15.937 ***** 19285 1727203916.86323: entering _queue_task() for managed-node2/include_tasks 19285 1727203916.86796: worker is 1 (out of 1 available) 19285 1727203916.86807: exiting _queue_task() for managed-node2/include_tasks 19285 1727203916.86817: done queuing things up, now waiting for results queue to drain 19285 1727203916.86818: waiting for pending results... 
19285 1727203916.86979: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 19285 1727203916.87102: in run() - task 028d2410-947f-f31b-fb3f-00000000022a 19285 1727203916.87120: variable 'ansible_search_path' from source: unknown 19285 1727203916.87153: variable 'ansible_search_path' from source: unknown 19285 1727203916.87172: calling self._execute() 19285 1727203916.87266: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203916.87278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203916.87370: variable 'omit' from source: magic vars 19285 1727203916.87679: variable 'ansible_distribution_major_version' from source: facts 19285 1727203916.87704: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203916.87714: _execute() done 19285 1727203916.87722: dumping result to json 19285 1727203916.87728: done dumping result, returning 19285 1727203916.87737: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-f31b-fb3f-00000000022a] 19285 1727203916.87746: sending task result for task 028d2410-947f-f31b-fb3f-00000000022a 19285 1727203916.87994: no more pending results, returning what we have 19285 1727203916.88000: in VariableManager get_vars() 19285 1727203916.88049: Calling all_inventory to load vars for managed-node2 19285 1727203916.88053: Calling groups_inventory to load vars for managed-node2 19285 1727203916.88057: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.88116: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.88120: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.88129: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.88353: done sending task result for task 028d2410-947f-f31b-fb3f-00000000022a 19285 1727203916.88356: WORKER PROCESS EXITING 19285 
1727203916.91051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.93201: done with get_vars() 19285 1727203916.93219: variable 'ansible_search_path' from source: unknown 19285 1727203916.93221: variable 'ansible_search_path' from source: unknown 19285 1727203916.93230: variable 'task' from source: play vars 19285 1727203916.93457: variable 'task' from source: play vars 19285 1727203916.93564: we have included files to process 19285 1727203916.93565: generating all_blocks data 19285 1727203916.93567: done generating all_blocks data 19285 1727203916.93568: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203916.93569: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203916.93572: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203916.93961: done processing included file 19285 1727203916.93963: iterating over new_blocks loaded from include file 19285 1727203916.93965: in VariableManager get_vars() 19285 1727203916.93991: done with get_vars() 19285 1727203916.93993: filtering new block on tags 19285 1727203916.94009: done filtering new block on tags 19285 1727203916.94011: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 19285 1727203916.94016: extending task lists for all hosts with included blocks 19285 1727203916.94120: done extending task lists 19285 1727203916.94121: done processing included files 19285 1727203916.94122: results queue empty 19285 1727203916.94122: checking for any_errors_fatal 19285 1727203916.94125: done checking 
for any_errors_fatal 19285 1727203916.94126: checking for max_fail_percentage 19285 1727203916.94127: done checking for max_fail_percentage 19285 1727203916.94128: checking to see if all hosts have failed and the running result is not ok 19285 1727203916.94129: done checking to see if all hosts have failed 19285 1727203916.94129: getting the remaining hosts for this loop 19285 1727203916.94130: done getting the remaining hosts for this loop 19285 1727203916.94133: getting the next task for host managed-node2 19285 1727203916.94137: done getting next task for host managed-node2 19285 1727203916.94139: ^ task is: TASK: Get stat for interface {{ interface }} 19285 1727203916.94142: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203916.94144: getting variables 19285 1727203916.94145: in VariableManager get_vars() 19285 1727203916.94158: Calling all_inventory to load vars for managed-node2 19285 1727203916.94160: Calling groups_inventory to load vars for managed-node2 19285 1727203916.94163: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203916.94168: Calling all_plugins_play to load vars for managed-node2 19285 1727203916.94171: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203916.94174: Calling groups_plugins_play to load vars for managed-node2 19285 1727203916.95291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203916.97586: done with get_vars() 19285 1727203916.97613: done getting variables 19285 1727203916.97987: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:51:56 -0400 (0:00:00.116) 0:00:16.054 ***** 19285 1727203916.98019: entering _queue_task() for managed-node2/stat 19285 1727203916.98786: worker is 1 (out of 1 available) 19285 1727203916.98799: exiting _queue_task() for managed-node2/stat 19285 1727203916.98813: done queuing things up, now waiting for results queue to drain 19285 1727203916.98814: waiting for pending results... 
19285 1727203916.99217: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 19285 1727203916.99380: in run() - task 028d2410-947f-f31b-fb3f-000000000235 19285 1727203916.99401: variable 'ansible_search_path' from source: unknown 19285 1727203916.99414: variable 'ansible_search_path' from source: unknown 19285 1727203916.99458: calling self._execute() 19285 1727203916.99572: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203916.99585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203916.99635: variable 'omit' from source: magic vars 19285 1727203917.00001: variable 'ansible_distribution_major_version' from source: facts 19285 1727203917.00020: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203917.00032: variable 'omit' from source: magic vars 19285 1727203917.00095: variable 'omit' from source: magic vars 19285 1727203917.00282: variable 'interface' from source: set_fact 19285 1727203917.00287: variable 'omit' from source: magic vars 19285 1727203917.00289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203917.00312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203917.00336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203917.00390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.00393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.00415: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203917.00424: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.00432: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.00548: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203917.00563: Set connection var ansible_pipelining to False 19285 1727203917.00606: Set connection var ansible_timeout to 10 19285 1727203917.00609: Set connection var ansible_shell_type to sh 19285 1727203917.00611: Set connection var ansible_shell_executable to /bin/sh 19285 1727203917.00618: Set connection var ansible_connection to ssh 19285 1727203917.00628: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.00637: variable 'ansible_connection' from source: unknown 19285 1727203917.00643: variable 'ansible_module_compression' from source: unknown 19285 1727203917.00649: variable 'ansible_shell_type' from source: unknown 19285 1727203917.00714: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.00717: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.00719: variable 'ansible_pipelining' from source: unknown 19285 1727203917.00725: variable 'ansible_timeout' from source: unknown 19285 1727203917.00728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.00908: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203917.00926: variable 'omit' from source: magic vars 19285 1727203917.00943: starting attempt loop 19285 1727203917.00950: running the handler 19285 1727203917.00970: _low_level_execute_command(): starting 19285 1727203917.00983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203917.02226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.02230: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 19285 1727203917.02338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.02446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.02585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.02673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.04496: stdout chunk (state=3): >>>/root <<< 19285 1727203917.04624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.04627: stdout chunk (state=3): >>><<< 19285 1727203917.04630: stderr chunk (state=3): >>><<< 19285 1727203917.04783: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.04786: _low_level_execute_command(): starting 19285 1727203917.04790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032 `" && echo ansible-tmp-1727203917.0465217-20759-241103879558032="` echo /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032 `" ) && sleep 0' 19285 1727203917.06232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.06258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203917.06282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.06392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.06474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.06627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.06702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.08649: stdout chunk (state=3): >>>ansible-tmp-1727203917.0465217-20759-241103879558032=/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032 <<< 19285 1727203917.08749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.08797: stderr chunk (state=3): >>><<< 19285 1727203917.08806: stdout chunk (state=3): >>><<< 19285 1727203917.08828: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203917.0465217-20759-241103879558032=/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.09173: variable 'ansible_module_compression' from source: unknown 19285 1727203917.09179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19285 1727203917.09309: variable 'ansible_facts' from source: unknown 19285 1727203917.09496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py 19285 1727203917.09743: Sending initial data 19285 1727203917.09746: Sent initial data (153 bytes) 19285 1727203917.10692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.10745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203917.10765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.10790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.10946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.12832: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203917.12836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp2wfr8dss /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py <<< 19285 1727203917.12839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py" <<< 19285 1727203917.12905: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp2wfr8dss" to remote "/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py" <<< 19285 1727203917.14957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.14961: stderr chunk (state=3): >>><<< 19285 1727203917.14963: stdout chunk (state=3): >>><<< 19285 1727203917.14981: done transferring module to remote 19285 1727203917.14997: _low_level_execute_command(): starting 19285 1727203917.15141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/ /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py && sleep 0' 19285 1727203917.16207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.16215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203917.16225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.16239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203917.16262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 
1727203917.16267: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203917.16279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.16371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203917.16388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.16443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.16526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.16677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.18438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.18681: stderr chunk (state=3): >>><<< 19285 1727203917.18685: stdout chunk (state=3): >>><<< 19285 1727203917.18688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.18691: _low_level_execute_command(): starting 19285 1727203917.18693: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/AnsiballZ_stat.py && sleep 0' 19285 1727203917.19916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.19925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203917.19936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.19951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203917.19966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203917.19973: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203917.19986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.20001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203917.20353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.20361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.20667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.35656: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27780, "dev": 23, "nlink": 1, "atime": 1727203913.8866475, "mtime": 1727203913.8866475, "ctime": 1727203913.8866475, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19285 1727203917.36970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203917.36974: stdout chunk (state=3): >>><<< 19285 1727203917.36983: stderr chunk (state=3): >>><<< 19285 1727203917.37001: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27780, "dev": 23, "nlink": 1, "atime": 1727203913.8866475, "mtime": 1727203913.8866475, "ctime": 1727203913.8866475, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203917.37059: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203917.37069: _low_level_execute_command(): starting 19285 1727203917.37079: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203917.0465217-20759-241103879558032/ > /dev/null 2>&1 && sleep 0' 19285 1727203917.37708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.37717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203917.37728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.37741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203917.37754: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203917.37764: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203917.37770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.37788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203917.37809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203917.37895: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203917.37910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.37926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.38078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.39913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.39937: stderr chunk (state=3): >>><<< 19285 1727203917.39942: stdout chunk (state=3): >>><<< 19285 1727203917.39957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.39974: handler run complete 19285 1727203917.40003: attempt loop complete, returning result 19285 1727203917.40006: _execute() done 19285 1727203917.40008: dumping result to json 19285 1727203917.40013: done dumping result, returning 19285 1727203917.40021: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000235] 19285 1727203917.40025: sending task result for task 028d2410-947f-f31b-fb3f-000000000235 19285 1727203917.40125: done sending task result for task 028d2410-947f-f31b-fb3f-000000000235 19285 1727203917.40128: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727203913.8866475, "block_size": 4096, "blocks": 0, "ctime": 1727203913.8866475, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27780, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1727203913.8866475, 
"nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 19285 1727203917.40227: no more pending results, returning what we have 19285 1727203917.40231: results queue empty 19285 1727203917.40232: checking for any_errors_fatal 19285 1727203917.40234: done checking for any_errors_fatal 19285 1727203917.40234: checking for max_fail_percentage 19285 1727203917.40236: done checking for max_fail_percentage 19285 1727203917.40237: checking to see if all hosts have failed and the running result is not ok 19285 1727203917.40238: done checking to see if all hosts have failed 19285 1727203917.40238: getting the remaining hosts for this loop 19285 1727203917.40240: done getting the remaining hosts for this loop 19285 1727203917.40243: getting the next task for host managed-node2 19285 1727203917.40251: done getting next task for host managed-node2 19285 1727203917.40255: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 19285 1727203917.40257: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203917.40260: getting variables 19285 1727203917.40262: in VariableManager get_vars() 19285 1727203917.40306: Calling all_inventory to load vars for managed-node2 19285 1727203917.40309: Calling groups_inventory to load vars for managed-node2 19285 1727203917.40312: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.40321: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.40327: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203917.40335: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.42149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.44547: done with get_vars() 19285 1727203917.44632: done getting variables 19285 1727203917.44703: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203917.44971: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:51:57 -0400 (0:00:00.470) 0:00:16.524 ***** 19285 1727203917.45031: entering _queue_task() for managed-node2/assert 19285 1727203917.45609: worker is 1 (out of 1 available) 19285 1727203917.45629: exiting _queue_task() for managed-node2/assert 19285 1727203917.45889: done queuing things up, now waiting for results queue to drain 19285 1727203917.45891: waiting for pending results... 
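An aside for readers following the trace: the `stat` result above reports `/sys/class/net/LSR-TST-br31` with `islnk: true` and `lnk_target: ../../devices/virtual/net/LSR-TST-br31`, i.e. the interface-present check boils down to "does a sysfs entry (normally a symlink) exist under /sys/class/net". A minimal sketch of the same check outside Ansible (the helper name and the `sysfs_root` parameter are illustrative assumptions, not part of the stat module):

```python
import os

def interface_present(name, sysfs_root="/sys/class/net"):
    # A network device appears as an entry under /sys/class/net,
    # normally a symlink into /sys/devices/...; lexists() matches the
    # islnk=true case in the stat result without following the link.
    return os.path.lexists(os.path.join(sysfs_root, name))
```

On the managed node this returns True for `LSR-TST-br31`, mirroring the `interface_stat.stat.exists` condition the following assert task evaluates.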
19285 1727203917.46342: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' 19285 1727203917.46354: in run() - task 028d2410-947f-f31b-fb3f-00000000022b 19285 1727203917.46377: variable 'ansible_search_path' from source: unknown 19285 1727203917.46386: variable 'ansible_search_path' from source: unknown 19285 1727203917.46436: calling self._execute() 19285 1727203917.46672: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.46678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.46688: variable 'omit' from source: magic vars 19285 1727203917.47180: variable 'ansible_distribution_major_version' from source: facts 19285 1727203917.47208: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203917.47233: variable 'omit' from source: magic vars 19285 1727203917.47311: variable 'omit' from source: magic vars 19285 1727203917.47424: variable 'interface' from source: set_fact 19285 1727203917.47455: variable 'omit' from source: magic vars 19285 1727203917.47506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203917.47567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203917.47636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203917.47639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.47642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.47687: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203917.47697: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.47705: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.47835: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203917.47853: Set connection var ansible_pipelining to False 19285 1727203917.47893: Set connection var ansible_timeout to 10 19285 1727203917.47896: Set connection var ansible_shell_type to sh 19285 1727203917.47901: Set connection var ansible_shell_executable to /bin/sh 19285 1727203917.47911: Set connection var ansible_connection to ssh 19285 1727203917.47961: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.47965: variable 'ansible_connection' from source: unknown 19285 1727203917.47967: variable 'ansible_module_compression' from source: unknown 19285 1727203917.47969: variable 'ansible_shell_type' from source: unknown 19285 1727203917.47971: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.47973: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.47974: variable 'ansible_pipelining' from source: unknown 19285 1727203917.47978: variable 'ansible_timeout' from source: unknown 19285 1727203917.47980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.48181: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203917.48185: variable 'omit' from source: magic vars 19285 1727203917.48187: starting attempt loop 19285 1727203917.48219: running the handler 19285 1727203917.48358: variable 'interface_stat' from source: set_fact 19285 1727203917.48386: Evaluated conditional (interface_stat.stat.exists): True 19285 1727203917.48402: handler run complete 19285 1727203917.48480: attempt loop complete, returning result 19285 
1727203917.48483: _execute() done 19285 1727203917.48485: dumping result to json 19285 1727203917.48487: done dumping result, returning 19285 1727203917.48490: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-00000000022b] 19285 1727203917.48492: sending task result for task 028d2410-947f-f31b-fb3f-00000000022b 19285 1727203917.48812: done sending task result for task 028d2410-947f-f31b-fb3f-00000000022b 19285 1727203917.48815: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 19285 1727203917.48866: no more pending results, returning what we have 19285 1727203917.48870: results queue empty 19285 1727203917.48871: checking for any_errors_fatal 19285 1727203917.48879: done checking for any_errors_fatal 19285 1727203917.48880: checking for max_fail_percentage 19285 1727203917.48882: done checking for max_fail_percentage 19285 1727203917.48883: checking to see if all hosts have failed and the running result is not ok 19285 1727203917.48883: done checking to see if all hosts have failed 19285 1727203917.48884: getting the remaining hosts for this loop 19285 1727203917.48886: done getting the remaining hosts for this loop 19285 1727203917.48890: getting the next task for host managed-node2 19285 1727203917.48897: done getting next task for host managed-node2 19285 1727203917.48899: ^ task is: TASK: meta (flush_handlers) 19285 1727203917.48901: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203917.48904: getting variables 19285 1727203917.48905: in VariableManager get_vars() 19285 1727203917.48939: Calling all_inventory to load vars for managed-node2 19285 1727203917.48942: Calling groups_inventory to load vars for managed-node2 19285 1727203917.48946: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.48956: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.48960: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203917.48965: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.51407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.53092: done with get_vars() 19285 1727203917.53121: done getting variables 19285 1727203917.53207: in VariableManager get_vars() 19285 1727203917.53227: Calling all_inventory to load vars for managed-node2 19285 1727203917.53229: Calling groups_inventory to load vars for managed-node2 19285 1727203917.53234: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.53239: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.53241: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203917.53244: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.54770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.56789: done with get_vars() 19285 1727203917.56819: done queuing things up, now waiting for results queue to drain 19285 1727203917.56821: results queue empty 19285 1727203917.56822: checking for any_errors_fatal 19285 1727203917.56825: done checking for any_errors_fatal 19285 1727203917.56825: checking for max_fail_percentage 19285 1727203917.56826: done checking for max_fail_percentage 19285 1727203917.56827: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203917.56828: done checking to see if all hosts have failed 19285 1727203917.56833: getting the remaining hosts for this loop 19285 1727203917.56834: done getting the remaining hosts for this loop 19285 1727203917.56837: getting the next task for host managed-node2 19285 1727203917.56841: done getting next task for host managed-node2 19285 1727203917.56842: ^ task is: TASK: meta (flush_handlers) 19285 1727203917.56844: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203917.56848: getting variables 19285 1727203917.56849: in VariableManager get_vars() 19285 1727203917.56865: Calling all_inventory to load vars for managed-node2 19285 1727203917.56867: Calling groups_inventory to load vars for managed-node2 19285 1727203917.56870: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.56891: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.56895: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203917.56899: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.58161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.59904: done with get_vars() 19285 1727203917.59920: done getting variables 19285 1727203917.59955: in VariableManager get_vars() 19285 1727203917.59964: Calling all_inventory to load vars for managed-node2 19285 1727203917.59965: Calling groups_inventory to load vars for managed-node2 19285 1727203917.59967: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.59970: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.59972: Calling groups_plugins_inventory to load vars for 
managed-node2 19285 1727203917.59973: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.60756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.62046: done with get_vars() 19285 1727203917.62297: done queuing things up, now waiting for results queue to drain 19285 1727203917.62299: results queue empty 19285 1727203917.62300: checking for any_errors_fatal 19285 1727203917.62301: done checking for any_errors_fatal 19285 1727203917.62302: checking for max_fail_percentage 19285 1727203917.62303: done checking for max_fail_percentage 19285 1727203917.62304: checking to see if all hosts have failed and the running result is not ok 19285 1727203917.62305: done checking to see if all hosts have failed 19285 1727203917.62305: getting the remaining hosts for this loop 19285 1727203917.62306: done getting the remaining hosts for this loop 19285 1727203917.62310: getting the next task for host managed-node2 19285 1727203917.62313: done getting next task for host managed-node2 19285 1727203917.62314: ^ task is: None 19285 1727203917.62315: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203917.62316: done queuing things up, now waiting for results queue to drain 19285 1727203917.62317: results queue empty 19285 1727203917.62318: checking for any_errors_fatal 19285 1727203917.62318: done checking for any_errors_fatal 19285 1727203917.62319: checking for max_fail_percentage 19285 1727203917.62319: done checking for max_fail_percentage 19285 1727203917.62320: checking to see if all hosts have failed and the running result is not ok 19285 1727203917.62321: done checking to see if all hosts have failed 19285 1727203917.62322: getting the next task for host managed-node2 19285 1727203917.62323: done getting next task for host managed-node2 19285 1727203917.62324: ^ task is: None 19285 1727203917.62325: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203917.62370: in VariableManager get_vars() 19285 1727203917.62396: done with get_vars() 19285 1727203917.62402: in VariableManager get_vars() 19285 1727203917.62414: done with get_vars() 19285 1727203917.62418: variable 'omit' from source: magic vars 19285 1727203917.62547: variable 'task' from source: play vars 19285 1727203917.62972: in VariableManager get_vars() 19285 1727203917.62985: done with get_vars() 19285 1727203917.63006: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 19285 1727203917.63439: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203917.63542: getting the remaining hosts for this loop 19285 1727203917.63543: done getting the remaining hosts for this loop 19285 1727203917.63546: getting the next task for host managed-node2 19285 1727203917.63548: done getting next task for host managed-node2 19285 1727203917.63550: ^ task is: TASK: Gathering Facts 19285 1727203917.63552: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203917.63554: getting variables 19285 1727203917.63554: in VariableManager get_vars() 19285 1727203917.63565: Calling all_inventory to load vars for managed-node2 19285 1727203917.63567: Calling groups_inventory to load vars for managed-node2 19285 1727203917.63569: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203917.63574: Calling all_plugins_play to load vars for managed-node2 19285 1727203917.63578: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203917.63658: Calling groups_plugins_play to load vars for managed-node2 19285 1727203917.65491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203917.67998: done with get_vars() 19285 1727203917.68021: done getting variables 19285 1727203917.68083: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:51:57 -0400 (0:00:00.230) 0:00:16.755 ***** 19285 1727203917.68110: entering _queue_task() for managed-node2/gather_facts 19285 1727203917.68471: worker is 1 (out of 1 available) 19285 1727203917.68484: exiting _queue_task() for managed-node2/gather_facts 19285 1727203917.68628: done queuing things up, now waiting for results queue to drain 19285 1727203917.68630: waiting for pending results... 
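For orientation: the `Gathering Facts` task queued here repeats the same `_low_level_execute_command()` lifecycle visible earlier for the stat module: probe the remote home (`echo ~ && sleep 0`), create a temp dir, sftp the AnsiballZ payload across, `chmod u+x` it, run it with the remote Python, then `rm -f -r` the temp dir. A rough local sketch of that command sequence, under the assumption of a POSIX shell (the `run()` helper and paths are illustrative; Ansible actually multiplexes these commands over the persistent SSH master connection shown in the debug output):

```python
import os
import subprocess
import sys
import tempfile

def run(cmd):
    # Stand-in for _low_level_execute_command(): execute via /bin/sh -c
    # and capture rc/stdout/stderr, like the trace lines above.
    p = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    return p.returncode, p.stdout, p.stderr

tmp = tempfile.mkdtemp(prefix="ansible-tmp-")       # stands in for the remote tmpdir
payload = os.path.join(tmp, "AnsiballZ_demo.py")
with open(payload, "w") as f:
    f.write('print({"changed": False})\n')          # toy stand-in for a module payload

rc, _, _ = run(f"chmod u+x {tmp} {payload} && sleep 0")
assert rc == 0
rc, out, _ = run(f"{sys.executable} {payload} && sleep 0")
run(f"rm -f -r {tmp} > /dev/null 2>&1 && sleep 0")  # cleanup, as in the trace
```

The `&& sleep 0` suffix on each command matches what the log shows; it forces the shell to report the compound command's exit status consistently.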
19285 1727203917.68957: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203917.69035: in run() - task 028d2410-947f-f31b-fb3f-00000000024e 19285 1727203917.69182: variable 'ansible_search_path' from source: unknown 19285 1727203917.69185: calling self._execute() 19285 1727203917.69240: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.69253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.69322: variable 'omit' from source: magic vars 19285 1727203917.69791: variable 'ansible_distribution_major_version' from source: facts 19285 1727203917.69810: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203917.69839: variable 'omit' from source: magic vars 19285 1727203917.69874: variable 'omit' from source: magic vars 19285 1727203917.69917: variable 'omit' from source: magic vars 19285 1727203917.69982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203917.70044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203917.70068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203917.70151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.70160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203917.70163: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203917.70165: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.70167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.70280: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203917.70288: Set connection var ansible_pipelining to False 19285 1727203917.70294: Set connection var ansible_timeout to 10 19285 1727203917.70296: Set connection var ansible_shell_type to sh 19285 1727203917.70302: Set connection var ansible_shell_executable to /bin/sh 19285 1727203917.70305: Set connection var ansible_connection to ssh 19285 1727203917.70320: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.70323: variable 'ansible_connection' from source: unknown 19285 1727203917.70326: variable 'ansible_module_compression' from source: unknown 19285 1727203917.70328: variable 'ansible_shell_type' from source: unknown 19285 1727203917.70330: variable 'ansible_shell_executable' from source: unknown 19285 1727203917.70332: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203917.70335: variable 'ansible_pipelining' from source: unknown 19285 1727203917.70338: variable 'ansible_timeout' from source: unknown 19285 1727203917.70342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203917.70479: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203917.70486: variable 'omit' from source: magic vars 19285 1727203917.70489: starting attempt loop 19285 1727203917.70491: running the handler 19285 1727203917.70500: variable 'ansible_facts' from source: unknown 19285 1727203917.70517: _low_level_execute_command(): starting 19285 1727203917.70523: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203917.71094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.71200: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.71204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.71215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.71374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.73081: stdout chunk (state=3): >>>/root <<< 19285 1727203917.73302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.73312: stdout chunk (state=3): >>><<< 19285 1727203917.73382: stderr chunk (state=3): >>><<< 19285 1727203917.73474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.73540: _low_level_execute_command(): starting 19285 1727203917.73581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985 `" && echo ansible-tmp-1727203917.7342906-20800-108409675149985="` echo /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985 `" ) && sleep 0' 19285 1727203917.75066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.75092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203917.75130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.75149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.75306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.77247: stdout chunk (state=3): >>>ansible-tmp-1727203917.7342906-20800-108409675149985=/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985 <<< 19285 1727203917.77386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.77399: stdout chunk (state=3): >>><<< 19285 1727203917.77416: stderr chunk (state=3): >>><<< 19285 1727203917.77460: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203917.7342906-20800-108409675149985=/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.77521: variable 'ansible_module_compression' from source: unknown 19285 1727203917.77581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203917.77656: variable 'ansible_facts' from source: unknown 19285 1727203917.77889: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py 19285 1727203917.78139: Sending initial data 19285 1727203917.78142: Sent initial data (154 bytes) 19285 1727203917.78866: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.78874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.78910: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.78939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.79045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.80647: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203917.80718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203917.80791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp54p9uw7v /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py <<< 19285 1727203917.80796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py" <<< 19285 1727203917.80881: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp54p9uw7v" to remote "/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py" <<< 19285 1727203917.82549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.82608: stderr chunk (state=3): >>><<< 19285 1727203917.82614: stdout chunk (state=3): >>><<< 19285 1727203917.82628: done transferring module to remote 19285 1727203917.82654: _low_level_execute_command(): starting 19285 1727203917.82662: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/ /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py && sleep 0' 19285 1727203917.83162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203917.83204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203917.83207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.83210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203917.83212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.83215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203917.83216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203917.83219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.83279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203917.83287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.83289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.83354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203917.85204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203917.85247: stderr chunk (state=3): >>><<< 19285 1727203917.85251: stdout chunk (state=3): >>><<< 19285 1727203917.85278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203917.85282: _low_level_execute_command(): starting 19285 1727203917.85285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/AnsiballZ_setup.py && sleep 0' 19285 1727203917.85996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203917.86066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 19285 1727203917.86093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203917.86145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203917.86273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203918.50000: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_lsb": {}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2936, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 595, "free": 2936}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "<<< 19285 1727203918.50014: stdout chunk (state=3): >>>ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", 
"sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 504, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788045312, "block_size": 4096, "block_total": 65519099, "block_available": 63913097, "block_used": 1606002, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, 
"module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless"<<< 19285 1727203918.50025: stdout chunk (state=3): >>>: "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_loadavg": {"1m": 0.63134765625, "5m": 0.41748046875, "15m": 0.20556640625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "58", "epoch": "1727203918", "epoch_int": "1727203918", "date": "2024-09-24", "time": "14:51:58", "iso8601_micro": "2024-09-24T18:51:58.496543Z", "iso8601": "2024-09-24T18:51:58Z", "iso8601_basic": "20240924T145158496543", "iso8601_basic_short": "20240924T145158", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203918.51985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203918.51989: stdout chunk (state=3): >>><<< 19285 1727203918.51991: stderr chunk (state=3): >>><<< 19285 1727203918.51994: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2936, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 595, "free": 2936}, "nocache": 
{"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 504, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, 
"size_available": 261788045312, "block_size": 4096, "block_total": 65519099, "block_available": 63913097, "block_used": 1606002, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", 
"tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_loadavg": {"1m": 0.63134765625, "5m": 0.41748046875, "15m": 0.20556640625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "51", "second": "58", "epoch": "1727203918", "epoch_int": "1727203918", "date": "2024-09-24", "time": "14:51:58", "iso8601_micro": "2024-09-24T18:51:58.496543Z", "iso8601": "2024-09-24T18:51:58Z", "iso8601_basic": "20240924T145158496543", "iso8601_basic_short": "20240924T145158", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203918.52452: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203918.52489: _low_level_execute_command(): starting 19285 1727203918.52498: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203917.7342906-20800-108409675149985/ > /dev/null 2>&1 && sleep 0' 19285 1727203918.53112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203918.53125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203918.53137: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203918.53146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203918.53199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203918.53211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203918.53291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203918.55147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203918.55150: stderr chunk (state=3): >>><<< 19285 1727203918.55152: stdout chunk (state=3): >>><<< 19285 1727203918.55170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 
1727203918.55180: handler run complete 19285 1727203918.55401: variable 'ansible_facts' from source: unknown 19285 1727203918.55429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.55769: variable 'ansible_facts' from source: unknown 19285 1727203918.55865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.56019: attempt loop complete, returning result 19285 1727203918.56030: _execute() done 19285 1727203918.56040: dumping result to json 19285 1727203918.56088: done dumping result, returning 19285 1727203918.56109: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-00000000024e] 19285 1727203918.56135: sending task result for task 028d2410-947f-f31b-fb3f-00000000024e 19285 1727203918.56458: done sending task result for task 028d2410-947f-f31b-fb3f-00000000024e 19285 1727203918.56463: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203918.56693: no more pending results, returning what we have 19285 1727203918.56695: results queue empty 19285 1727203918.56696: checking for any_errors_fatal 19285 1727203918.56697: done checking for any_errors_fatal 19285 1727203918.56697: checking for max_fail_percentage 19285 1727203918.56698: done checking for max_fail_percentage 19285 1727203918.56699: checking to see if all hosts have failed and the running result is not ok 19285 1727203918.56700: done checking to see if all hosts have failed 19285 1727203918.56700: getting the remaining hosts for this loop 19285 1727203918.56701: done getting the remaining hosts for this loop 19285 1727203918.56704: getting the next task for host managed-node2 19285 1727203918.56707: done getting next task for host managed-node2 19285 1727203918.56709: ^ task is: TASK: meta (flush_handlers) 19285 1727203918.56710: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203918.56712: getting variables 19285 1727203918.56713: in VariableManager get_vars() 19285 1727203918.56730: Calling all_inventory to load vars for managed-node2 19285 1727203918.56731: Calling groups_inventory to load vars for managed-node2 19285 1727203918.56733: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.56743: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.56745: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.56749: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.57956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.59401: done with get_vars() 19285 1727203918.59416: done getting variables 19285 1727203918.59473: in VariableManager get_vars() 19285 1727203918.59482: Calling all_inventory to load vars for managed-node2 19285 1727203918.59484: Calling groups_inventory to load vars for managed-node2 19285 1727203918.59486: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.59489: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.59491: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.59492: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.60125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.61048: done with get_vars() 19285 1727203918.61080: done queuing things up, now waiting for results queue to drain 19285 1727203918.61083: results queue empty 19285 1727203918.61084: checking for any_errors_fatal 19285 1727203918.61087: done checking for any_errors_fatal 19285 
1727203918.61088: checking for max_fail_percentage 19285 1727203918.61089: done checking for max_fail_percentage 19285 1727203918.61095: checking to see if all hosts have failed and the running result is not ok 19285 1727203918.61096: done checking to see if all hosts have failed 19285 1727203918.61096: getting the remaining hosts for this loop 19285 1727203918.61097: done getting the remaining hosts for this loop 19285 1727203918.61100: getting the next task for host managed-node2 19285 1727203918.61104: done getting next task for host managed-node2 19285 1727203918.61107: ^ task is: TASK: Include the task '{{ task }}' 19285 1727203918.61108: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203918.61110: getting variables 19285 1727203918.61111: in VariableManager get_vars() 19285 1727203918.61122: Calling all_inventory to load vars for managed-node2 19285 1727203918.61124: Calling groups_inventory to load vars for managed-node2 19285 1727203918.61130: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.61136: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.61138: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.61140: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.62679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.64323: done with get_vars() 19285 1727203918.64350: done getting variables 19285 1727203918.65077: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:51:58 -0400 (0:00:00.970) 0:00:17.726 ***** 19285 1727203918.65172: entering _queue_task() for managed-node2/include_tasks 19285 1727203918.65740: worker is 1 (out of 1 available) 19285 1727203918.65791: exiting _queue_task() for managed-node2/include_tasks 19285 1727203918.65804: done queuing things up, now waiting for results queue to drain 19285 1727203918.65806: waiting for pending results... 19285 1727203918.66058: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' 19285 1727203918.66182: in run() - task 028d2410-947f-f31b-fb3f-000000000031 19285 1727203918.66187: variable 'ansible_search_path' from source: unknown 19285 1727203918.66206: calling self._execute() 19285 1727203918.66304: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.66310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.66323: variable 'omit' from source: magic vars 19285 1727203918.66587: variable 'ansible_distribution_major_version' from source: facts 19285 1727203918.66597: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203918.66604: variable 'task' from source: play vars 19285 1727203918.66655: variable 'task' from source: play vars 19285 1727203918.66665: _execute() done 19285 1727203918.66668: dumping result to json 19285 1727203918.66671: done dumping result, returning 19285 1727203918.66673: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_present.yml' [028d2410-947f-f31b-fb3f-000000000031] 19285 1727203918.66682: sending task result for task 028d2410-947f-f31b-fb3f-000000000031 19285 1727203918.66765: done sending task result for task 028d2410-947f-f31b-fb3f-000000000031 19285 1727203918.66768: WORKER PROCESS EXITING 19285 1727203918.66794: no more 
pending results, returning what we have 19285 1727203918.66798: in VariableManager get_vars() 19285 1727203918.66830: Calling all_inventory to load vars for managed-node2 19285 1727203918.66833: Calling groups_inventory to load vars for managed-node2 19285 1727203918.66836: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.66849: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.66852: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.66855: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.67664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.68516: done with get_vars() 19285 1727203918.68529: variable 'ansible_search_path' from source: unknown 19285 1727203918.68538: we have included files to process 19285 1727203918.68539: generating all_blocks data 19285 1727203918.68540: done generating all_blocks data 19285 1727203918.68540: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19285 1727203918.68541: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19285 1727203918.68543: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19285 1727203918.68671: in VariableManager get_vars() 19285 1727203918.68684: done with get_vars() 19285 1727203918.69016: done processing included file 19285 1727203918.69019: iterating over new_blocks loaded from include file 19285 1727203918.69020: in VariableManager get_vars() 19285 1727203918.69037: done with get_vars() 19285 1727203918.69038: filtering new block on tags 19285 1727203918.69065: done filtering new block on tags 19285 1727203918.69068: done iterating over 
new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 19285 1727203918.69074: extending task lists for all hosts with included blocks 19285 1727203918.69108: done extending task lists 19285 1727203918.69109: done processing included files 19285 1727203918.69110: results queue empty 19285 1727203918.69111: checking for any_errors_fatal 19285 1727203918.69112: done checking for any_errors_fatal 19285 1727203918.69113: checking for max_fail_percentage 19285 1727203918.69114: done checking for max_fail_percentage 19285 1727203918.69115: checking to see if all hosts have failed and the running result is not ok 19285 1727203918.69115: done checking to see if all hosts have failed 19285 1727203918.69116: getting the remaining hosts for this loop 19285 1727203918.69117: done getting the remaining hosts for this loop 19285 1727203918.69119: getting the next task for host managed-node2 19285 1727203918.69128: done getting next task for host managed-node2 19285 1727203918.69131: ^ task is: TASK: Include the task 'get_profile_stat.yml' 19285 1727203918.69133: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203918.69135: getting variables 19285 1727203918.69139: in VariableManager get_vars() 19285 1727203918.69148: Calling all_inventory to load vars for managed-node2 19285 1727203918.69150: Calling groups_inventory to load vars for managed-node2 19285 1727203918.69152: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.69156: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.69159: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.69161: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.70983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.72552: done with get_vars() 19285 1727203918.72572: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:51:58 -0400 (0:00:00.075) 0:00:17.801 ***** 19285 1727203918.72704: entering _queue_task() for managed-node2/include_tasks 19285 1727203918.73281: worker is 1 (out of 1 available) 19285 1727203918.73296: exiting _queue_task() for managed-node2/include_tasks 19285 1727203918.73306: done queuing things up, now waiting for results queue to drain 19285 1727203918.73308: waiting for pending results... 
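The entries above trace the full lifecycle of an `include_tasks` task: it is queued for a worker, its conditional is evaluated, the included file is loaded, and its blocks are filtered on tags and spliced into the host's task list ("extending task lists for all hosts with included blocks"). A minimal sketch of the include chain being exercised — only the file names and the conditional are taken from the trace; the surrounding play structure is assumed:

```yaml
# Hypothetical reconstruction of the include chain seen in the trace.
# run_tasks.yml receives the file name via the 'task' play var
# (the trace shows "variable 'task' from source: play vars").
- name: Include the task 'tasks/assert_profile_present.yml'
  include_tasks: "{{ task }}"   # task: tasks/assert_profile_present.yml
  when: ansible_distribution_major_version != '6'

# assert_profile_present.yml in turn includes the stat helper at its line 3:
- name: Include the task 'get_profile_stat.yml'
  include_tasks: tasks/get_profile_stat.yml
  when: ansible_distribution_major_version != '6'
```

Note that both includes are dynamic (`include_tasks`, not `import_tasks`), which is why the trace shows the included files being parsed at runtime ("we have included files to process / generating all_blocks data") rather than at play compile time.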
19285 1727203918.73799: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 19285 1727203918.73805: in run() - task 028d2410-947f-f31b-fb3f-00000000025f 19285 1727203918.73807: variable 'ansible_search_path' from source: unknown 19285 1727203918.73809: variable 'ansible_search_path' from source: unknown 19285 1727203918.73812: calling self._execute() 19285 1727203918.73839: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.73852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.73874: variable 'omit' from source: magic vars 19285 1727203918.74316: variable 'ansible_distribution_major_version' from source: facts 19285 1727203918.74355: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203918.74371: _execute() done 19285 1727203918.74384: dumping result to json 19285 1727203918.74398: done dumping result, returning 19285 1727203918.74415: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-f31b-fb3f-00000000025f] 19285 1727203918.74441: sending task result for task 028d2410-947f-f31b-fb3f-00000000025f 19285 1727203918.74585: no more pending results, returning what we have 19285 1727203918.74591: in VariableManager get_vars() 19285 1727203918.74631: Calling all_inventory to load vars for managed-node2 19285 1727203918.74635: Calling groups_inventory to load vars for managed-node2 19285 1727203918.74639: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.74659: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.74665: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.74668: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.75390: done sending task result for task 028d2410-947f-f31b-fb3f-00000000025f 19285 1727203918.75394: WORKER PROCESS EXITING 19285 
1727203918.76554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.78436: done with get_vars() 19285 1727203918.78456: variable 'ansible_search_path' from source: unknown 19285 1727203918.78458: variable 'ansible_search_path' from source: unknown 19285 1727203918.78471: variable 'task' from source: play vars 19285 1727203918.78607: variable 'task' from source: play vars 19285 1727203918.78639: we have included files to process 19285 1727203918.78641: generating all_blocks data 19285 1727203918.78643: done generating all_blocks data 19285 1727203918.78644: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203918.78645: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203918.78647: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203918.79851: done processing included file 19285 1727203918.79854: iterating over new_blocks loaded from include file 19285 1727203918.79856: in VariableManager get_vars() 19285 1727203918.79871: done with get_vars() 19285 1727203918.79873: filtering new block on tags 19285 1727203918.79905: done filtering new block on tags 19285 1727203918.79908: in VariableManager get_vars() 19285 1727203918.79920: done with get_vars() 19285 1727203918.79921: filtering new block on tags 19285 1727203918.79940: done filtering new block on tags 19285 1727203918.79942: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 19285 1727203918.79947: extending task lists for all hosts with included blocks 19285 1727203918.80107: done extending 
task lists 19285 1727203918.80108: done processing included files 19285 1727203918.80109: results queue empty 19285 1727203918.80116: checking for any_errors_fatal 19285 1727203918.80119: done checking for any_errors_fatal 19285 1727203918.80120: checking for max_fail_percentage 19285 1727203918.80121: done checking for max_fail_percentage 19285 1727203918.80122: checking to see if all hosts have failed and the running result is not ok 19285 1727203918.80123: done checking to see if all hosts have failed 19285 1727203918.80124: getting the remaining hosts for this loop 19285 1727203918.80125: done getting the remaining hosts for this loop 19285 1727203918.80127: getting the next task for host managed-node2 19285 1727203918.80131: done getting next task for host managed-node2 19285 1727203918.80134: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 19285 1727203918.80136: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203918.80138: getting variables 19285 1727203918.80139: in VariableManager get_vars() 19285 1727203918.84501: Calling all_inventory to load vars for managed-node2 19285 1727203918.84505: Calling groups_inventory to load vars for managed-node2 19285 1727203918.84508: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.84513: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.84516: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.84519: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.85715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.87397: done with get_vars() 19285 1727203918.87419: done getting variables 19285 1727203918.87466: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:51:58 -0400 (0:00:00.147) 0:00:17.949 ***** 19285 1727203918.87499: entering _queue_task() for managed-node2/set_fact 19285 1727203918.87887: worker is 1 (out of 1 available) 19285 1727203918.87899: exiting _queue_task() for managed-node2/set_fact 19285 1727203918.87920: done queuing things up, now waiting for results queue to drain 19285 1727203918.87922: waiting for pending results... 
19285 1727203918.88159: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 19285 1727203918.88269: in run() - task 028d2410-947f-f31b-fb3f-00000000026c 19285 1727203918.88283: variable 'ansible_search_path' from source: unknown 19285 1727203918.88287: variable 'ansible_search_path' from source: unknown 19285 1727203918.88321: calling self._execute() 19285 1727203918.88417: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.88421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.88434: variable 'omit' from source: magic vars 19285 1727203918.88823: variable 'ansible_distribution_major_version' from source: facts 19285 1727203918.88834: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203918.88841: variable 'omit' from source: magic vars 19285 1727203918.88889: variable 'omit' from source: magic vars 19285 1727203918.88931: variable 'omit' from source: magic vars 19285 1727203918.88981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203918.89013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203918.89123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203918.89127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203918.89130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203918.89132: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203918.89135: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.89137: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 19285 1727203918.89203: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203918.89210: Set connection var ansible_pipelining to False 19285 1727203918.89216: Set connection var ansible_timeout to 10 19285 1727203918.89219: Set connection var ansible_shell_type to sh 19285 1727203918.89226: Set connection var ansible_shell_executable to /bin/sh 19285 1727203918.89234: Set connection var ansible_connection to ssh 19285 1727203918.89255: variable 'ansible_shell_executable' from source: unknown 19285 1727203918.89258: variable 'ansible_connection' from source: unknown 19285 1727203918.89264: variable 'ansible_module_compression' from source: unknown 19285 1727203918.89267: variable 'ansible_shell_type' from source: unknown 19285 1727203918.89269: variable 'ansible_shell_executable' from source: unknown 19285 1727203918.89271: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.89273: variable 'ansible_pipelining' from source: unknown 19285 1727203918.89277: variable 'ansible_timeout' from source: unknown 19285 1727203918.89279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.89417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203918.89427: variable 'omit' from source: magic vars 19285 1727203918.89432: starting attempt loop 19285 1727203918.89435: running the handler 19285 1727203918.89454: handler run complete 19285 1727203918.89466: attempt loop complete, returning result 19285 1727203918.89469: _execute() done 19285 1727203918.89472: dumping result to json 19285 1727203918.89474: done dumping result, returning 19285 1727203918.89478: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-f31b-fb3f-00000000026c] 19285 1727203918.89557: sending task result for task 028d2410-947f-f31b-fb3f-00000000026c 19285 1727203918.89621: done sending task result for task 028d2410-947f-f31b-fb3f-00000000026c 19285 1727203918.89624: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 19285 1727203918.89709: no more pending results, returning what we have 19285 1727203918.89711: results queue empty 19285 1727203918.89712: checking for any_errors_fatal 19285 1727203918.89714: done checking for any_errors_fatal 19285 1727203918.89714: checking for max_fail_percentage 19285 1727203918.89716: done checking for max_fail_percentage 19285 1727203918.89717: checking to see if all hosts have failed and the running result is not ok 19285 1727203918.89718: done checking to see if all hosts have failed 19285 1727203918.89718: getting the remaining hosts for this loop 19285 1727203918.89720: done getting the remaining hosts for this loop 19285 1727203918.89723: getting the next task for host managed-node2 19285 1727203918.89729: done getting next task for host managed-node2 19285 1727203918.89731: ^ task is: TASK: Stat profile file 19285 1727203918.89734: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203918.89737: getting variables 19285 1727203918.89738: in VariableManager get_vars() 19285 1727203918.89763: Calling all_inventory to load vars for managed-node2 19285 1727203918.89765: Calling groups_inventory to load vars for managed-node2 19285 1727203918.89768: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203918.89778: Calling all_plugins_play to load vars for managed-node2 19285 1727203918.89780: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203918.89783: Calling groups_plugins_play to load vars for managed-node2 19285 1727203918.91265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203918.92953: done with get_vars() 19285 1727203918.92989: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:51:58 -0400 (0:00:00.055) 0:00:18.005 ***** 19285 1727203918.93081: entering _queue_task() for managed-node2/stat 19285 1727203918.93587: worker is 1 (out of 1 available) 19285 1727203918.93598: exiting _queue_task() for managed-node2/stat 19285 1727203918.93608: done queuing things up, now waiting for results queue to drain 19285 1727203918.93609: waiting for pending results... 
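The `ok:` result above comes from a `set_fact` task that seeds three flags to `false` before later checks flip them. Judging from the logged `ansible_facts`, the task at `get_profile_stat.yml:3` is equivalent to this sketch:

```yaml
# Sketch inferred from the logged ansible_facts; key order and style assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

`set_fact` is an action that runs entirely on the controller, which is consistent with the trace: the handler runs and completes immediately ("running the handler / handler run complete") with no `_low_level_execute_command()` or SSH traffic in between, unlike the `stat` task that follows.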
19285 1727203918.93794: running TaskExecutor() for managed-node2/TASK: Stat profile file 19285 1727203918.93893: in run() - task 028d2410-947f-f31b-fb3f-00000000026d 19285 1727203918.93945: variable 'ansible_search_path' from source: unknown 19285 1727203918.93948: variable 'ansible_search_path' from source: unknown 19285 1727203918.93965: calling self._execute() 19285 1727203918.94070: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.94086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.94108: variable 'omit' from source: magic vars 19285 1727203918.94595: variable 'ansible_distribution_major_version' from source: facts 19285 1727203918.94599: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203918.94602: variable 'omit' from source: magic vars 19285 1727203918.94648: variable 'omit' from source: magic vars 19285 1727203918.94749: variable 'profile' from source: play vars 19285 1727203918.94754: variable 'interface' from source: set_fact 19285 1727203918.94873: variable 'interface' from source: set_fact 19285 1727203918.94879: variable 'omit' from source: magic vars 19285 1727203918.94882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203918.94921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203918.94939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203918.94956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203918.94968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203918.95030: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 
1727203918.95033: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.95036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.95105: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203918.95113: Set connection var ansible_pipelining to False 19285 1727203918.95119: Set connection var ansible_timeout to 10 19285 1727203918.95121: Set connection var ansible_shell_type to sh 19285 1727203918.95244: Set connection var ansible_shell_executable to /bin/sh 19285 1727203918.95250: Set connection var ansible_connection to ssh 19285 1727203918.95253: variable 'ansible_shell_executable' from source: unknown 19285 1727203918.95255: variable 'ansible_connection' from source: unknown 19285 1727203918.95258: variable 'ansible_module_compression' from source: unknown 19285 1727203918.95260: variable 'ansible_shell_type' from source: unknown 19285 1727203918.95265: variable 'ansible_shell_executable' from source: unknown 19285 1727203918.95267: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203918.95270: variable 'ansible_pipelining' from source: unknown 19285 1727203918.95273: variable 'ansible_timeout' from source: unknown 19285 1727203918.95277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203918.95383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203918.95392: variable 'omit' from source: magic vars 19285 1727203918.95397: starting attempt loop 19285 1727203918.95400: running the handler 19285 1727203918.95417: _low_level_execute_command(): starting 19285 1727203918.95423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203918.95950: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203918.95956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203918.95959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203918.96023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203918.96025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203918.96027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203918.96099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203918.97799: stdout chunk (state=3): >>>/root <<< 19285 1727203918.97897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203918.97937: stderr chunk (state=3): >>><<< 19285 1727203918.97942: stdout chunk (state=3): >>><<< 19285 1727203918.97982: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203918.97993: _low_level_execute_command(): starting 19285 1727203918.97999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923 `" && echo ansible-tmp-1727203918.9798052-20852-122215849647923="` echo /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923 `" ) && sleep 0' 19285 1727203918.98605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203918.98609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203918.98620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203918.98623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203918.98626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203918.98643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203918.98752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.00630: stdout chunk (state=3): >>>ansible-tmp-1727203918.9798052-20852-122215849647923=/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923 <<< 19285 1727203919.00741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.00762: stderr chunk (state=3): >>><<< 19285 1727203919.00768: stdout chunk (state=3): >>><<< 19285 1727203919.00786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203918.9798052-20852-122215849647923=/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.00827: variable 'ansible_module_compression' from source: unknown 19285 1727203919.00873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19285 1727203919.00909: variable 'ansible_facts' from source: unknown 19285 1727203919.00958: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py 19285 1727203919.01055: Sending initial data 19285 1727203919.01058: Sent initial data (153 bytes) 19285 1727203919.01461: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.01482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.01495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.01548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.01554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203919.01556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.01622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.03191: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 19285 1727203919.03197: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203919.03270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203919.03366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmppotzb03s /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py <<< 19285 1727203919.03370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py" <<< 19285 1727203919.03437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmppotzb03s" to remote "/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py" <<< 19285 1727203919.04100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.04134: stderr chunk (state=3): >>><<< 19285 1727203919.04137: stdout chunk (state=3): >>><<< 19285 1727203919.04171: done transferring module to remote 19285 1727203919.04181: _low_level_execute_command(): starting 19285 1727203919.04186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/ /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py && sleep 0' 19285 1727203919.04602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203919.04606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.04608: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203919.04613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.04615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.04672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.04674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.04742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.06517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.06539: stderr chunk (state=3): >>><<< 19285 1727203919.06542: stdout chunk (state=3): >>><<< 19285 1727203919.06554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.06557: _low_level_execute_command(): starting 19285 1727203919.06566: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/AnsiballZ_stat.py && sleep 0' 19285 1727203919.06957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.06995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.06998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.07000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203919.07003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203919.07004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.07051: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.07055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.07137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.22239: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19285 1727203919.23522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203919.23550: stderr chunk (state=3): >>><<< 19285 1727203919.23553: stdout chunk (state=3): >>><<< 19285 1727203919.23569: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203919.23595: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203919.23604: _low_level_execute_command(): starting 19285 1727203919.23609: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203918.9798052-20852-122215849647923/ > /dev/null 2>&1 && sleep 0' 19285 1727203919.24055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.24096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 
1727203919.24099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.24103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203919.24105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.24107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.24154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.24158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203919.24160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.24234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.26112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.26137: stderr chunk (state=3): >>><<< 19285 1727203919.26140: stdout chunk (state=3): >>><<< 19285 1727203919.26156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.26159: handler run complete 19285 1727203919.26179: attempt loop complete, returning result 19285 1727203919.26182: _execute() done 19285 1727203919.26185: dumping result to json 19285 1727203919.26187: done dumping result, returning 19285 1727203919.26194: done running TaskExecutor() for managed-node2/TASK: Stat profile file [028d2410-947f-f31b-fb3f-00000000026d] 19285 1727203919.26199: sending task result for task 028d2410-947f-f31b-fb3f-00000000026d 19285 1727203919.26290: done sending task result for task 028d2410-947f-f31b-fb3f-00000000026d 19285 1727203919.26292: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 19285 1727203919.26345: no more pending results, returning what we have 19285 1727203919.26348: results queue empty 19285 1727203919.26348: checking for any_errors_fatal 19285 1727203919.26354: done checking for any_errors_fatal 19285 1727203919.26355: checking for max_fail_percentage 19285 1727203919.26356: done checking for max_fail_percentage 19285 1727203919.26357: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.26358: done checking to see if all hosts have failed 19285 1727203919.26359: getting the remaining hosts for this loop 19285 
1727203919.26360: done getting the remaining hosts for this loop 19285 1727203919.26364: getting the next task for host managed-node2 19285 1727203919.26371: done getting next task for host managed-node2 19285 1727203919.26374: ^ task is: TASK: Set NM profile exist flag based on the profile files 19285 1727203919.26379: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203919.26382: getting variables 19285 1727203919.26383: in VariableManager get_vars() 19285 1727203919.26411: Calling all_inventory to load vars for managed-node2 19285 1727203919.26414: Calling groups_inventory to load vars for managed-node2 19285 1727203919.26417: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.26427: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.26430: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.26432: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.27255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.28122: done with get_vars() 19285 1727203919.28136: done getting variables 19285 1727203919.28180: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.351) 0:00:18.356 ***** 19285 1727203919.28202: entering _queue_task() for managed-node2/set_fact 19285 1727203919.28421: worker is 1 (out of 1 available) 19285 1727203919.28432: exiting _queue_task() for managed-node2/set_fact 19285 1727203919.28444: done queuing things up, now waiting for results queue to drain 19285 1727203919.28445: waiting for pending results... 
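
[Editor's note] The "Stat profile file" task above ran Ansible's `stat` module against `/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31` and got back `{"changed": false, "stat": {"exists": false}}`. As a rough illustration only (this is a hypothetical sketch, not Ansible's actual module code), the check the module performed with `follow: false` amounts to:

```python
import os

def stat_module(path, follow=False):
    """Minimal sketch of the result shape seen in the log above.

    Hypothetical helper: with follow=False the real module lstat()s the
    path (does not follow symlinks), which os.path.lexists() mirrors.
    """
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    # The real module adds many more fields when the file exists; only
    # the "exists" flag matters for the conditional evaluated below.
    return {"changed": False, "stat": {"exists": exists}}
```

Since the ifcfg file is absent on managed-node2, `exists` comes back `False`, which is what drives the skip of the next task.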
19285 1727203919.28617: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 19285 1727203919.28692: in run() - task 028d2410-947f-f31b-fb3f-00000000026e 19285 1727203919.28702: variable 'ansible_search_path' from source: unknown 19285 1727203919.28705: variable 'ansible_search_path' from source: unknown 19285 1727203919.28731: calling self._execute() 19285 1727203919.28806: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.28810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.28819: variable 'omit' from source: magic vars 19285 1727203919.29099: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.29111: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.29195: variable 'profile_stat' from source: set_fact 19285 1727203919.29207: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203919.29212: when evaluation is False, skipping this task 19285 1727203919.29216: _execute() done 19285 1727203919.29219: dumping result to json 19285 1727203919.29223: done dumping result, returning 19285 1727203919.29226: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-f31b-fb3f-00000000026e] 19285 1727203919.29228: sending task result for task 028d2410-947f-f31b-fb3f-00000000026e 19285 1727203919.29313: done sending task result for task 028d2410-947f-f31b-fb3f-00000000026e 19285 1727203919.29315: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203919.29385: no more pending results, returning what we have 19285 1727203919.29389: results queue empty 19285 1727203919.29390: checking for any_errors_fatal 19285 1727203919.29398: done checking for any_errors_fatal 19285 1727203919.29399: 
checking for max_fail_percentage 19285 1727203919.29400: done checking for max_fail_percentage 19285 1727203919.29401: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.29402: done checking to see if all hosts have failed 19285 1727203919.29402: getting the remaining hosts for this loop 19285 1727203919.29404: done getting the remaining hosts for this loop 19285 1727203919.29407: getting the next task for host managed-node2 19285 1727203919.29412: done getting next task for host managed-node2 19285 1727203919.29414: ^ task is: TASK: Get NM profile info 19285 1727203919.29417: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203919.29420: getting variables 19285 1727203919.29421: in VariableManager get_vars() 19285 1727203919.29444: Calling all_inventory to load vars for managed-node2 19285 1727203919.29448: Calling groups_inventory to load vars for managed-node2 19285 1727203919.29451: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.29460: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.29463: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.29465: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.30341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.31202: done with get_vars() 19285 1727203919.31215: done getting variables 19285 1727203919.31287: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.031) 0:00:18.387 ***** 19285 1727203919.31307: entering _queue_task() for managed-node2/shell 19285 1727203919.31308: Creating lock for shell 19285 1727203919.31523: worker is 1 (out of 1 available) 19285 1727203919.31535: exiting _queue_task() for managed-node2/shell 19285 1727203919.31546: done queuing things up, now waiting for results queue to drain 19285 1727203919.31548: waiting for pending results... 
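
[Editor's note] The "Set NM profile exist flag based on the profile files" task above was skipped because its `when:` condition, `profile_stat.stat.exists`, evaluated to `False`. The task file (`get_profile_stat.yml:17`) is not shown in the log; based on the evaluated conditional it presumably looks something like the following hypothetical sketch (variable name `lsr_net_profile_exists` is a guess, not taken from the log):

```yaml
# Hypothetical reconstruction -- not the actual contents of get_profile_stat.yml
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
```

When the conditional is false, Ansible reports `skipping:` with `"false_condition": "profile_stat.stat.exists"`, exactly as seen in the result block above.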
19285 1727203919.31721: running TaskExecutor() for managed-node2/TASK: Get NM profile info 19285 1727203919.31786: in run() - task 028d2410-947f-f31b-fb3f-00000000026f 19285 1727203919.31800: variable 'ansible_search_path' from source: unknown 19285 1727203919.31804: variable 'ansible_search_path' from source: unknown 19285 1727203919.31829: calling self._execute() 19285 1727203919.31904: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.31908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.31917: variable 'omit' from source: magic vars 19285 1727203919.32190: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.32199: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.32204: variable 'omit' from source: magic vars 19285 1727203919.32242: variable 'omit' from source: magic vars 19285 1727203919.32312: variable 'profile' from source: play vars 19285 1727203919.32317: variable 'interface' from source: set_fact 19285 1727203919.32366: variable 'interface' from source: set_fact 19285 1727203919.32380: variable 'omit' from source: magic vars 19285 1727203919.32412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203919.32440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203919.32459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203919.32471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.32483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.32506: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 
1727203919.32509: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.32511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.32588: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203919.32594: Set connection var ansible_pipelining to False 19285 1727203919.32599: Set connection var ansible_timeout to 10 19285 1727203919.32602: Set connection var ansible_shell_type to sh 19285 1727203919.32608: Set connection var ansible_shell_executable to /bin/sh 19285 1727203919.32611: Set connection var ansible_connection to ssh 19285 1727203919.32625: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.32628: variable 'ansible_connection' from source: unknown 19285 1727203919.32630: variable 'ansible_module_compression' from source: unknown 19285 1727203919.32632: variable 'ansible_shell_type' from source: unknown 19285 1727203919.32635: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.32637: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.32641: variable 'ansible_pipelining' from source: unknown 19285 1727203919.32645: variable 'ansible_timeout' from source: unknown 19285 1727203919.32647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.32746: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.32755: variable 'omit' from source: magic vars 19285 1727203919.32759: starting attempt loop 19285 1727203919.32764: running the handler 19285 1727203919.32773: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.32793: _low_level_execute_command(): starting 19285 1727203919.32800: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203919.33305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.33309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.33313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.33316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.33371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.33374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203919.33382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.33458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.35171: stdout chunk (state=3): >>>/root <<< 19285 1727203919.35278: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.35305: stderr chunk (state=3): >>><<< 19285 1727203919.35308: stdout chunk (state=3): >>><<< 19285 1727203919.35326: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.35338: _low_level_execute_command(): starting 19285 1727203919.35342: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420 `" && echo ansible-tmp-1727203919.353267-20865-31939225995420="` echo /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420 `" ) && sleep 0' 19285 1727203919.35752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.35786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.35789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203919.35792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203919.35794: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.35796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.35847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.35850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203919.35857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.35933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.37850: stdout chunk (state=3): >>>ansible-tmp-1727203919.353267-20865-31939225995420=/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420 <<< 19285 1727203919.37956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.37982: stderr chunk (state=3): >>><<< 19285 1727203919.37985: stdout chunk (state=3): >>><<< 19285 1727203919.37999: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1727203919.353267-20865-31939225995420=/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.38024: variable 'ansible_module_compression' from source: unknown 19285 1727203919.38068: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19285 1727203919.38103: variable 'ansible_facts' from source: unknown 19285 1727203919.38150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py 19285 1727203919.38247: Sending initial data 19285 1727203919.38251: Sent initial data (154 bytes) 19285 1727203919.38673: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.38682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.38707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.38711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.38713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.38766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.38772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.38845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.40417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 19285 1727203919.40421: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203919.40485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203919.40566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp8sbx1xwd /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py <<< 19285 1727203919.40568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py" <<< 19285 1727203919.40624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp8sbx1xwd" to remote "/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py" <<< 19285 1727203919.41270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.41309: stderr chunk (state=3): >>><<< 19285 1727203919.41314: stdout chunk (state=3): >>><<< 19285 1727203919.41340: done transferring module to remote 19285 1727203919.41348: _low_level_execute_command(): starting 19285 1727203919.41353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/ /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py && sleep 0' 19285 1727203919.41754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203919.41761: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203919.41797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.41801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203919.41804: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.41848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.41852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.41928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.43748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.43770: stderr chunk (state=3): >>><<< 19285 1727203919.43773: stdout chunk (state=3): >>><<< 19285 1727203919.43787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.43790: _low_level_execute_command(): starting 19285 1727203919.43797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/AnsiballZ_command.py && sleep 0' 19285 1727203919.44214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203919.44218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.44220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203919.44222: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 
1727203919.44224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.44280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.44286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203919.44289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.44365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.61347: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:51:59.595076", "end": "2024-09-24 14:51:59.612380", "delta": "0:00:00.017304", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203919.62967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203919.62998: stderr chunk (state=3): >>><<< 19285 1727203919.63001: stdout chunk (state=3): >>><<< 19285 1727203919.63018: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:51:59.595076", "end": "2024-09-24 14:51:59.612380", "delta": "0:00:00.017304", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203919.63046: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203919.63057: _low_level_execute_command(): starting 19285 1727203919.63059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203919.353267-20865-31939225995420/ > /dev/null 2>&1 && sleep 0' 19285 1727203919.63526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203919.63529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.63534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203919.63536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203919.63538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203919.63591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203919.63594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203919.63678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203919.65553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203919.65581: stderr chunk (state=3): >>><<< 19285 1727203919.65586: stdout chunk (state=3): >>><<< 19285 1727203919.65602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203919.65608: handler run complete 19285 1727203919.65625: Evaluated conditional (False): False 19285 1727203919.65634: attempt loop complete, returning result 19285 1727203919.65636: _execute() done 19285 1727203919.65639: dumping result to json 19285 1727203919.65643: done dumping result, returning 19285 1727203919.65650: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [028d2410-947f-f31b-fb3f-00000000026f] 19285 1727203919.65655: sending task result for task 028d2410-947f-f31b-fb3f-00000000026f 19285 1727203919.65745: done sending task result for task 028d2410-947f-f31b-fb3f-00000000026f 19285 1727203919.65748: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017304", "end": "2024-09-24 14:51:59.612380", "rc": 0, "start": "2024-09-24 14:51:59.595076" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 19285 1727203919.65815: no more pending results, returning what we have 19285 1727203919.65819: results queue empty 19285 1727203919.65820: checking for any_errors_fatal 19285 1727203919.65829: done checking for any_errors_fatal 19285 1727203919.65829: checking for max_fail_percentage 19285 1727203919.65831: done checking for max_fail_percentage 19285 1727203919.65832: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.65833: done checking to see if all hosts have failed 19285 1727203919.65833: getting the remaining hosts for this loop 19285 1727203919.65835: done getting the remaining hosts for this loop 19285 1727203919.65838: getting the next task for host managed-node2 19285 1727203919.65845: done getting next 
task for host managed-node2 19285 1727203919.65848: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19285 1727203919.65851: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203919.65858: getting variables 19285 1727203919.65859: in VariableManager get_vars() 19285 1727203919.65897: Calling all_inventory to load vars for managed-node2 19285 1727203919.65900: Calling groups_inventory to load vars for managed-node2 19285 1727203919.65904: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.65914: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.65916: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.65919: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.66732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.67613: done with get_vars() 19285 1727203919.67631: done getting variables 19285 1727203919.67677: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.363) 0:00:18.751 ***** 19285 1727203919.67699: entering _queue_task() for managed-node2/set_fact 19285 1727203919.67935: worker is 1 (out of 1 available) 19285 1727203919.67947: exiting _queue_task() for managed-node2/set_fact 19285 1727203919.67958: done queuing things up, now waiting for results queue to drain 19285 1727203919.67960: waiting for pending results... 
19285 1727203919.68130: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19285 1727203919.68206: in run() - task 028d2410-947f-f31b-fb3f-000000000270 19285 1727203919.68217: variable 'ansible_search_path' from source: unknown 19285 1727203919.68221: variable 'ansible_search_path' from source: unknown 19285 1727203919.68248: calling self._execute() 19285 1727203919.68320: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.68324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.68333: variable 'omit' from source: magic vars 19285 1727203919.68610: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.68621: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.68712: variable 'nm_profile_exists' from source: set_fact 19285 1727203919.68724: Evaluated conditional (nm_profile_exists.rc == 0): True 19285 1727203919.68730: variable 'omit' from source: magic vars 19285 1727203919.68766: variable 'omit' from source: magic vars 19285 1727203919.68788: variable 'omit' from source: magic vars 19285 1727203919.68821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203919.68846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203919.68867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203919.68881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.68890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.68913: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
19285 1727203919.68916: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.68919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.68993: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203919.68999: Set connection var ansible_pipelining to False 19285 1727203919.69004: Set connection var ansible_timeout to 10 19285 1727203919.69007: Set connection var ansible_shell_type to sh 19285 1727203919.69013: Set connection var ansible_shell_executable to /bin/sh 19285 1727203919.69016: Set connection var ansible_connection to ssh 19285 1727203919.69030: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.69033: variable 'ansible_connection' from source: unknown 19285 1727203919.69035: variable 'ansible_module_compression' from source: unknown 19285 1727203919.69037: variable 'ansible_shell_type' from source: unknown 19285 1727203919.69040: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.69042: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.69047: variable 'ansible_pipelining' from source: unknown 19285 1727203919.69049: variable 'ansible_timeout' from source: unknown 19285 1727203919.69053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.69153: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.69163: variable 'omit' from source: magic vars 19285 1727203919.69168: starting attempt loop 19285 1727203919.69170: running the handler 19285 1727203919.69185: handler run complete 19285 1727203919.69193: attempt loop complete, returning result 19285 1727203919.69196: _execute() done 
19285 1727203919.69198: dumping result to json 19285 1727203919.69200: done dumping result, returning 19285 1727203919.69207: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-f31b-fb3f-000000000270] 19285 1727203919.69211: sending task result for task 028d2410-947f-f31b-fb3f-000000000270 19285 1727203919.69291: done sending task result for task 028d2410-947f-f31b-fb3f-000000000270 19285 1727203919.69293: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 19285 1727203919.69341: no more pending results, returning what we have 19285 1727203919.69344: results queue empty 19285 1727203919.69345: checking for any_errors_fatal 19285 1727203919.69352: done checking for any_errors_fatal 19285 1727203919.69352: checking for max_fail_percentage 19285 1727203919.69354: done checking for max_fail_percentage 19285 1727203919.69355: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.69356: done checking to see if all hosts have failed 19285 1727203919.69356: getting the remaining hosts for this loop 19285 1727203919.69358: done getting the remaining hosts for this loop 19285 1727203919.69364: getting the next task for host managed-node2 19285 1727203919.69373: done getting next task for host managed-node2 19285 1727203919.69376: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 19285 1727203919.69380: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203919.69383: getting variables
19285 1727203919.69384: in VariableManager get_vars()
19285 1727203919.69409: Calling all_inventory to load vars for managed-node2
19285 1727203919.69412: Calling groups_inventory to load vars for managed-node2
19285 1727203919.69415: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203919.69423: Calling all_plugins_play to load vars for managed-node2
19285 1727203919.69425: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203919.69428: Calling groups_plugins_play to load vars for managed-node2
19285 1727203919.70295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203919.71153: done with get_vars()
19285 1727203919.71168: done getting variables
19285 1727203919.71210: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
19285 1727203919.71294: variable 'profile' from source: play vars
19285 1727203919.71297: variable 'interface' from source: set_fact
19285 1727203919.71339: variable 'interface' from source: set_fact

TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] *******************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.036) 0:00:18.788 *****
19285 1727203919.71366: entering _queue_task() for managed-node2/command
19285 1727203919.71592: worker is 1 (out of 1 available)
19285 1727203919.71604: exiting _queue_task() for managed-node2/command
19285 1727203919.71616: done queuing things up, now waiting for results queue to drain
19285 1727203919.71617: waiting for pending results...
19285 1727203919.71791: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31
19285 1727203919.71857: in run() - task 028d2410-947f-f31b-fb3f-000000000272
19285 1727203919.71873: variable 'ansible_search_path' from source: unknown
19285 1727203919.71879: variable 'ansible_search_path' from source: unknown
19285 1727203919.71904: calling self._execute()
19285 1727203919.71979: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203919.71983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203919.71992: variable 'omit' from source: magic vars
19285 1727203919.72262: variable 'ansible_distribution_major_version' from source: facts
19285 1727203919.72274: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203919.72359: variable 'profile_stat' from source: set_fact
19285 1727203919.72372: Evaluated conditional (profile_stat.stat.exists): False
19285 1727203919.72376: when evaluation is False, skipping this task
19285 1727203919.72380: _execute() done
19285 1727203919.72382: dumping result to json
19285 1727203919.72386: done dumping result, returning
19285 1727203919.72390: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000272]
19285 1727203919.72395: sending task result for task 028d2410-947f-f31b-fb3f-000000000272
19285 1727203919.72473: done sending task result for task 028d2410-947f-f31b-fb3f-000000000272
19285 1727203919.72478: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
19285 1727203919.72546: no more pending results, returning what we have
19285 1727203919.72549: results queue empty
19285 1727203919.72550: checking for any_errors_fatal
19285 1727203919.72558: done checking for any_errors_fatal
19285 1727203919.72558: checking for max_fail_percentage
19285 1727203919.72560: done checking for max_fail_percentage
19285 1727203919.72560: checking to see if all hosts have failed and the running result is not ok
19285 1727203919.72561: done checking to see if all hosts have failed
19285 1727203919.72562: getting the remaining hosts for this loop
19285 1727203919.72563: done getting the remaining hosts for this loop
19285 1727203919.72567: getting the next task for host managed-node2
19285 1727203919.72573: done getting next task for host managed-node2
19285 1727203919.72578: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
19285 1727203919.72582: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203919.72584: getting variables
19285 1727203919.72586: in VariableManager get_vars()
19285 1727203919.72609: Calling all_inventory to load vars for managed-node2
19285 1727203919.72612: Calling groups_inventory to load vars for managed-node2
19285 1727203919.72614: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203919.72623: Calling all_plugins_play to load vars for managed-node2
19285 1727203919.72625: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203919.72628: Calling groups_plugins_play to load vars for managed-node2
19285 1727203919.73386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203919.74344: done with get_vars()
19285 1727203919.74358: done getting variables
19285 1727203919.74402: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
19285 1727203919.74482: variable 'profile' from source: play vars
19285 1727203919.74485: variable 'interface' from source: set_fact
19285 1727203919.74524: variable 'interface' from source: set_fact

TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.031) 0:00:18.819 *****
19285 1727203919.74547: entering _queue_task() for managed-node2/set_fact
19285 1727203919.74778: worker is 1 (out of 1 available)
19285 1727203919.74791: exiting _queue_task() for managed-node2/set_fact
19285 1727203919.74803: done queuing things up, now waiting for results queue to drain
19285 1727203919.74804: waiting for pending results...
19285 1727203919.74974: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31
19285 1727203919.75052: in run() - task 028d2410-947f-f31b-fb3f-000000000273
19285 1727203919.75062: variable 'ansible_search_path' from source: unknown
19285 1727203919.75065: variable 'ansible_search_path' from source: unknown
19285 1727203919.75098: calling self._execute()
19285 1727203919.75165: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203919.75171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203919.75184: variable 'omit' from source: magic vars
19285 1727203919.75442: variable 'ansible_distribution_major_version' from source: facts
19285 1727203919.75452: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203919.75534: variable 'profile_stat' from source: set_fact
19285 1727203919.75548: Evaluated conditional (profile_stat.stat.exists): False
19285 1727203919.75551: when evaluation is False, skipping this task
19285 1727203919.75554: _execute() done
19285 1727203919.75556: dumping result to json
19285 1727203919.75559: done dumping result, returning
19285 1727203919.75562: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000273]
19285 1727203919.75570: sending task result for task 028d2410-947f-f31b-fb3f-000000000273
19285 1727203919.75650: done sending task result for task 028d2410-947f-f31b-fb3f-000000000273
19285 1727203919.75653: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
19285 1727203919.75716: no more pending results, returning what we have
19285 1727203919.75719: results queue empty
19285 1727203919.75720: checking for any_errors_fatal
19285 1727203919.75725: done checking for any_errors_fatal
19285 1727203919.75726: checking for max_fail_percentage
19285 1727203919.75727: done checking for max_fail_percentage
19285 1727203919.75728: checking to see if all hosts have failed and the running result is not ok
19285 1727203919.75728: done checking to see if all hosts have failed
19285 1727203919.75729: getting the remaining hosts for this loop
19285 1727203919.75730: done getting the remaining hosts for this loop
19285 1727203919.75733: getting the next task for host managed-node2
19285 1727203919.75740: done getting next task for host managed-node2
19285 1727203919.75742: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
19285 1727203919.75746: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203919.75749: getting variables
19285 1727203919.75750: in VariableManager get_vars()
19285 1727203919.75774: Calling all_inventory to load vars for managed-node2
19285 1727203919.75778: Calling groups_inventory to load vars for managed-node2
19285 1727203919.75781: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203919.75790: Calling all_plugins_play to load vars for managed-node2
19285 1727203919.75792: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203919.75794: Calling groups_plugins_play to load vars for managed-node2
19285 1727203919.76553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203919.77426: done with get_vars()
19285 1727203919.77440: done getting variables
19285 1727203919.77488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
19285 1727203919.77562: variable 'profile' from source: play vars
19285 1727203919.77565: variable 'interface' from source: set_fact
19285 1727203919.77608: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] ***********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.030) 0:00:18.850 *****
19285 1727203919.77629: entering _queue_task() for managed-node2/command
19285 1727203919.77847: worker is 1 (out of 1 available)
19285 1727203919.77863: exiting _queue_task() for managed-node2/command
19285 1727203919.77877: done queuing things up, now waiting for results queue to drain
19285 1727203919.77878: waiting for pending results...
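[Editor's note: the repeated "Evaluated conditional (profile_stat.stat.exists): False ... skipping this task" pattern above comes from guarded tasks in get_profile_stat.yml. The task file itself is not part of this log; the following is a hypothetical minimal sketch of that guard pattern, reconstructed only from the task names, modules, and conditionals the log reports. The `grep` command body and the ifcfg path are assumptions for illustration.]

```yaml
# Hypothetical sketch -- not the actual get_profile_stat.yml contents.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # assumed command
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - profile_stat.stat.exists                   # evaluated False, so the task is skipped
```

When a `when:` list item evaluates False, Ansible records it as `false_condition` in the skip result, which is exactly what the `skipping: [managed-node2]` JSON blocks in this log show.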
19285 1727203919.78042: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31
19285 1727203919.78108: in run() - task 028d2410-947f-f31b-fb3f-000000000274
19285 1727203919.78121: variable 'ansible_search_path' from source: unknown
19285 1727203919.78124: variable 'ansible_search_path' from source: unknown
19285 1727203919.78151: calling self._execute()
19285 1727203919.78221: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203919.78225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203919.78236: variable 'omit' from source: magic vars
19285 1727203919.78495: variable 'ansible_distribution_major_version' from source: facts
19285 1727203919.78505: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203919.78589: variable 'profile_stat' from source: set_fact
19285 1727203919.78600: Evaluated conditional (profile_stat.stat.exists): False
19285 1727203919.78603: when evaluation is False, skipping this task
19285 1727203919.78606: _execute() done
19285 1727203919.78608: dumping result to json
19285 1727203919.78611: done dumping result, returning
19285 1727203919.78617: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000274]
19285 1727203919.78622: sending task result for task 028d2410-947f-f31b-fb3f-000000000274
19285 1727203919.78706: done sending task result for task 028d2410-947f-f31b-fb3f-000000000274
19285 1727203919.78709: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
19285 1727203919.78756: no more pending results, returning what we have
19285 1727203919.78759: results queue empty
19285 1727203919.78760: checking for any_errors_fatal
19285 1727203919.78770: done checking for any_errors_fatal
19285 1727203919.78771: checking for max_fail_percentage
19285 1727203919.78772: done checking for max_fail_percentage
19285 1727203919.78773: checking to see if all hosts have failed and the running result is not ok
19285 1727203919.78774: done checking to see if all hosts have failed
19285 1727203919.78777: getting the remaining hosts for this loop
19285 1727203919.78778: done getting the remaining hosts for this loop
19285 1727203919.78781: getting the next task for host managed-node2
19285 1727203919.78789: done getting next task for host managed-node2
19285 1727203919.78791: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
19285 1727203919.78794: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203919.78797: getting variables
19285 1727203919.78798: in VariableManager get_vars()
19285 1727203919.78821: Calling all_inventory to load vars for managed-node2
19285 1727203919.78824: Calling groups_inventory to load vars for managed-node2
19285 1727203919.78826: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203919.78836: Calling all_plugins_play to load vars for managed-node2
19285 1727203919.78838: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203919.78840: Calling groups_plugins_play to load vars for managed-node2
19285 1727203919.79845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203919.81331: done with get_vars()
19285 1727203919.81354: done getting variables
19285 1727203919.81414: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
19285 1727203919.81526: variable 'profile' from source: play vars
19285 1727203919.81530: variable 'interface' from source: set_fact
19285 1727203919.81591: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.039) 0:00:18.890 *****
19285 1727203919.81620: entering _queue_task() for managed-node2/set_fact
19285 1727203919.81939: worker is 1 (out of 1 available)
19285 1727203919.81951: exiting _queue_task() for managed-node2/set_fact
19285 1727203919.81963: done queuing things up, now waiting for results queue to drain
19285 1727203919.81965: waiting for pending results...
19285 1727203919.82395: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31
19285 1727203919.82400: in run() - task 028d2410-947f-f31b-fb3f-000000000275
19285 1727203919.82404: variable 'ansible_search_path' from source: unknown
19285 1727203919.82407: variable 'ansible_search_path' from source: unknown
19285 1727203919.82429: calling self._execute()
19285 1727203919.82534: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203919.82546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203919.82565: variable 'omit' from source: magic vars
19285 1727203919.82947: variable 'ansible_distribution_major_version' from source: facts
19285 1727203919.83035: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203919.83101: variable 'profile_stat' from source: set_fact
19285 1727203919.83121: Evaluated conditional (profile_stat.stat.exists): False
19285 1727203919.83130: when evaluation is False, skipping this task
19285 1727203919.83143: _execute() done
19285 1727203919.83152: dumping result to json
19285 1727203919.83160: done dumping result, returning
19285 1727203919.83173: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000275]
19285 1727203919.83188: sending task result for task 028d2410-947f-f31b-fb3f-000000000275
19285 1727203919.83418: done sending task result for task 028d2410-947f-f31b-fb3f-000000000275
19285 1727203919.83422: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
19285 1727203919.83470: no more pending results, returning what we have
19285 1727203919.83474: results queue empty
19285 1727203919.83478: checking for any_errors_fatal
19285 1727203919.83484: done checking for any_errors_fatal
19285 1727203919.83485: checking for max_fail_percentage
19285 1727203919.83487: done checking for max_fail_percentage
19285 1727203919.83488: checking to see if all hosts have failed and the running result is not ok
19285 1727203919.83489: done checking to see if all hosts have failed
19285 1727203919.83489: getting the remaining hosts for this loop
19285 1727203919.83491: done getting the remaining hosts for this loop
19285 1727203919.83495: getting the next task for host managed-node2
19285 1727203919.83504: done getting next task for host managed-node2
19285 1727203919.83507: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
19285 1727203919.83511: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203919.83515: getting variables
19285 1727203919.83517: in VariableManager get_vars()
19285 1727203919.83546: Calling all_inventory to load vars for managed-node2
19285 1727203919.83549: Calling groups_inventory to load vars for managed-node2
19285 1727203919.83553: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203919.83568: Calling all_plugins_play to load vars for managed-node2
19285 1727203919.83570: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203919.83573: Calling groups_plugins_play to load vars for managed-node2
19285 1727203919.84998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203919.85965: done with get_vars()
19285 1727203919.85985: done getting variables
19285 1727203919.86029: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
19285 1727203919.86115: variable 'profile' from source: play vars
19285 1727203919.86118: variable 'interface' from source: set_fact
19285 1727203919.86159: variable 'interface' from source: set_fact

TASK [Assert that the profile is present - 'LSR-TST-br31'] *********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.045) 0:00:18.936 *****
19285 1727203919.86184: entering _queue_task() for managed-node2/assert
19285 1727203919.86432: worker is 1 (out of 1 available)
19285 1727203919.86443: exiting _queue_task() for managed-node2/assert
19285 1727203919.86455: done queuing things up, now waiting for results queue to drain
19285 1727203919.86457: waiting for pending results...
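[Editor's note: the assert tasks that follow report "Evaluated conditional (lsr_net_profile_exists): True" and "All assertions passed". The playbook source is not part of this log; a hypothetical minimal sketch of what assert_profile_present.yml:5 likely contains, based only on the task name and the conditional the log evaluates (the `fail_msg` text is an assumption):]

```yaml
# Hypothetical sketch -- not the actual assert_profile_present.yml contents.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists        # set earlier by get_profile_stat.yml
    fail_msg: "Profile {{ profile }} is not present"  # assumed message
```

The `assert` action runs entirely on the controller, which is why the log shows no remote command execution for these tasks, only conditional evaluation followed by `ok: [managed-node2]`.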
19285 1727203919.86634: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' 19285 1727203919.86709: in run() - task 028d2410-947f-f31b-fb3f-000000000260 19285 1727203919.86719: variable 'ansible_search_path' from source: unknown 19285 1727203919.86723: variable 'ansible_search_path' from source: unknown 19285 1727203919.86750: calling self._execute() 19285 1727203919.86822: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.86826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.86836: variable 'omit' from source: magic vars 19285 1727203919.87103: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.87114: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.87118: variable 'omit' from source: magic vars 19285 1727203919.87150: variable 'omit' from source: magic vars 19285 1727203919.87218: variable 'profile' from source: play vars 19285 1727203919.87222: variable 'interface' from source: set_fact 19285 1727203919.87269: variable 'interface' from source: set_fact 19285 1727203919.87285: variable 'omit' from source: magic vars 19285 1727203919.87318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203919.87352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203919.87366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203919.87382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.87391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.87414: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 19285 1727203919.87418: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.87421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.87495: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203919.87501: Set connection var ansible_pipelining to False 19285 1727203919.87507: Set connection var ansible_timeout to 10 19285 1727203919.87509: Set connection var ansible_shell_type to sh 19285 1727203919.87515: Set connection var ansible_shell_executable to /bin/sh 19285 1727203919.87518: Set connection var ansible_connection to ssh 19285 1727203919.87533: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.87536: variable 'ansible_connection' from source: unknown 19285 1727203919.87538: variable 'ansible_module_compression' from source: unknown 19285 1727203919.87541: variable 'ansible_shell_type' from source: unknown 19285 1727203919.87543: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.87545: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.87549: variable 'ansible_pipelining' from source: unknown 19285 1727203919.87552: variable 'ansible_timeout' from source: unknown 19285 1727203919.87556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.87655: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.87667: variable 'omit' from source: magic vars 19285 1727203919.87672: starting attempt loop 19285 1727203919.87674: running the handler 19285 1727203919.87745: variable 'lsr_net_profile_exists' from source: set_fact 19285 1727203919.87748: Evaluated conditional 
(lsr_net_profile_exists): True 19285 1727203919.87754: handler run complete 19285 1727203919.87767: attempt loop complete, returning result 19285 1727203919.87769: _execute() done 19285 1727203919.87772: dumping result to json 19285 1727203919.87775: done dumping result, returning 19285 1727203919.87783: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-000000000260] 19285 1727203919.87787: sending task result for task 028d2410-947f-f31b-fb3f-000000000260 19285 1727203919.87866: done sending task result for task 028d2410-947f-f31b-fb3f-000000000260 19285 1727203919.87868: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 19285 1727203919.87941: no more pending results, returning what we have 19285 1727203919.87944: results queue empty 19285 1727203919.87945: checking for any_errors_fatal 19285 1727203919.87952: done checking for any_errors_fatal 19285 1727203919.87953: checking for max_fail_percentage 19285 1727203919.87955: done checking for max_fail_percentage 19285 1727203919.87956: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.87957: done checking to see if all hosts have failed 19285 1727203919.87957: getting the remaining hosts for this loop 19285 1727203919.87959: done getting the remaining hosts for this loop 19285 1727203919.87964: getting the next task for host managed-node2 19285 1727203919.87970: done getting next task for host managed-node2 19285 1727203919.87973: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 19285 1727203919.87977: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203919.87980: getting variables 19285 1727203919.87981: in VariableManager get_vars() 19285 1727203919.88005: Calling all_inventory to load vars for managed-node2 19285 1727203919.88007: Calling groups_inventory to load vars for managed-node2 19285 1727203919.88010: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.88019: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.88021: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.88024: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.88913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.89773: done with get_vars() 19285 1727203919.89788: done getting variables 19285 1727203919.89832: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203919.89909: variable 'profile' from source: play vars 19285 1727203919.89912: variable 'interface' from source: set_fact 19285 1727203919.89953: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:51:59 -0400 
(0:00:00.037) 0:00:18.974 ***** 19285 1727203919.89981: entering _queue_task() for managed-node2/assert 19285 1727203919.90205: worker is 1 (out of 1 available) 19285 1727203919.90217: exiting _queue_task() for managed-node2/assert 19285 1727203919.90228: done queuing things up, now waiting for results queue to drain 19285 1727203919.90229: waiting for pending results... 19285 1727203919.90401: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 19285 1727203919.90474: in run() - task 028d2410-947f-f31b-fb3f-000000000261 19285 1727203919.90486: variable 'ansible_search_path' from source: unknown 19285 1727203919.90490: variable 'ansible_search_path' from source: unknown 19285 1727203919.90517: calling self._execute() 19285 1727203919.90591: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.90594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.90604: variable 'omit' from source: magic vars 19285 1727203919.90862: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.90874: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.90881: variable 'omit' from source: magic vars 19285 1727203919.90912: variable 'omit' from source: magic vars 19285 1727203919.90980: variable 'profile' from source: play vars 19285 1727203919.90984: variable 'interface' from source: set_fact 19285 1727203919.91032: variable 'interface' from source: set_fact 19285 1727203919.91047: variable 'omit' from source: magic vars 19285 1727203919.91082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203919.91111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203919.91129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 
1727203919.91143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.91152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.91179: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203919.91182: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.91184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.91256: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203919.91261: Set connection var ansible_pipelining to False 19285 1727203919.91269: Set connection var ansible_timeout to 10 19285 1727203919.91272: Set connection var ansible_shell_type to sh 19285 1727203919.91324: Set connection var ansible_shell_executable to /bin/sh 19285 1727203919.91328: Set connection var ansible_connection to ssh 19285 1727203919.91332: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.91335: variable 'ansible_connection' from source: unknown 19285 1727203919.91337: variable 'ansible_module_compression' from source: unknown 19285 1727203919.91338: variable 'ansible_shell_type' from source: unknown 19285 1727203919.91340: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.91342: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.91344: variable 'ansible_pipelining' from source: unknown 19285 1727203919.91347: variable 'ansible_timeout' from source: unknown 19285 1727203919.91349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.91413: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.91422: variable 'omit' from source: magic vars 19285 1727203919.91428: starting attempt loop 19285 1727203919.91431: running the handler 19285 1727203919.91508: variable 'lsr_net_profile_ansible_managed' from source: set_fact 19285 1727203919.91512: Evaluated conditional (lsr_net_profile_ansible_managed): True 19285 1727203919.91517: handler run complete 19285 1727203919.91528: attempt loop complete, returning result 19285 1727203919.91531: _execute() done 19285 1727203919.91533: dumping result to json 19285 1727203919.91536: done dumping result, returning 19285 1727203919.91544: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-000000000261] 19285 1727203919.91547: sending task result for task 028d2410-947f-f31b-fb3f-000000000261 19285 1727203919.91623: done sending task result for task 028d2410-947f-f31b-fb3f-000000000261 19285 1727203919.91625: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
19285 1727203919.91694: no more pending results, returning what we have 19285 1727203919.91698: results queue empty 19285 1727203919.91699: checking for any_errors_fatal 19285 1727203919.91705: done checking for any_errors_fatal 19285 1727203919.91706: checking for max_fail_percentage 19285 1727203919.91708: done checking for max_fail_percentage 19285 1727203919.91709: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.91710: done checking to see if all hosts have failed 19285 1727203919.91711: getting the remaining hosts for this loop 19285 1727203919.91712: done getting the remaining hosts for this loop 19285 1727203919.91715: getting the next task for host managed-node2 19285 1727203919.91721: done
getting next task for host managed-node2 19285 1727203919.91724: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 19285 1727203919.91727: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203919.91730: getting variables 19285 1727203919.91731: in VariableManager get_vars() 19285 1727203919.91756: Calling all_inventory to load vars for managed-node2 19285 1727203919.91759: Calling groups_inventory to load vars for managed-node2 19285 1727203919.91762: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.91770: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.91773: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.91777: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.92545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.93512: done with get_vars() 19285 1727203919.93527: done getting variables 19285 1727203919.93569: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203919.93646: variable 'profile' from source: play vars 19285 1727203919.93649: variable 'interface' 
from source: set_fact 19285 1727203919.93693: variable 'interface' from source: set_fact
TASK [Assert that the fingerprint comment is present in LSR-TST-br31] **********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 14:51:59 -0400 (0:00:00.037) 0:00:19.011 *****
19285 1727203919.93720: entering _queue_task() for managed-node2/assert 19285 1727203919.93948: worker is 1 (out of 1 available) 19285 1727203919.93960: exiting _queue_task() for managed-node2/assert 19285 1727203919.93977: done queuing things up, now waiting for results queue to drain 19285 1727203919.93978: waiting for pending results... 19285 1727203919.94144: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 19285 1727203919.94215: in run() - task 028d2410-947f-f31b-fb3f-000000000262 19285 1727203919.94226: variable 'ansible_search_path' from source: unknown 19285 1727203919.94229: variable 'ansible_search_path' from source: unknown 19285 1727203919.94257: calling self._execute() 19285 1727203919.94327: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.94330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.94340: variable 'omit' from source: magic vars 19285 1727203919.94600: variable 'ansible_distribution_major_version' from source: facts 19285 1727203919.94610: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203919.94616: variable 'omit' from source: magic vars 19285 1727203919.94643: variable 'omit' from source: magic vars 19285 1727203919.94721: variable 'profile' from source: play vars 19285 1727203919.94725: variable 'interface' from source: set_fact 19285 1727203919.94780: variable 'interface' from source: set_fact 19285 1727203919.94795: variable 'omit' from source: magic vars 19285 1727203919.94828: trying
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203919.94857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203919.94884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203919.94897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.94907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203919.94931: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203919.94934: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.94936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.95013: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203919.95019: Set connection var ansible_pipelining to False 19285 1727203919.95025: Set connection var ansible_timeout to 10 19285 1727203919.95027: Set connection var ansible_shell_type to sh 19285 1727203919.95033: Set connection var ansible_shell_executable to /bin/sh 19285 1727203919.95036: Set connection var ansible_connection to ssh 19285 1727203919.95052: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.95055: variable 'ansible_connection' from source: unknown 19285 1727203919.95058: variable 'ansible_module_compression' from source: unknown 19285 1727203919.95060: variable 'ansible_shell_type' from source: unknown 19285 1727203919.95062: variable 'ansible_shell_executable' from source: unknown 19285 1727203919.95068: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203919.95072: variable 'ansible_pipelining' from source: unknown 19285 1727203919.95077: variable 'ansible_timeout' from 
source: unknown 19285 1727203919.95079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203919.95178: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203919.95189: variable 'omit' from source: magic vars 19285 1727203919.95192: starting attempt loop 19285 1727203919.95196: running the handler 19285 1727203919.95269: variable 'lsr_net_profile_fingerprint' from source: set_fact 19285 1727203919.95272: Evaluated conditional (lsr_net_profile_fingerprint): True 19285 1727203919.95279: handler run complete 19285 1727203919.95291: attempt loop complete, returning result 19285 1727203919.95293: _execute() done 19285 1727203919.95296: dumping result to json 19285 1727203919.95298: done dumping result, returning 19285 1727203919.95306: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000262] 19285 1727203919.95309: sending task result for task 028d2410-947f-f31b-fb3f-000000000262 19285 1727203919.95388: done sending task result for task 028d2410-947f-f31b-fb3f-000000000262 19285 1727203919.95390: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
19285 1727203919.95468: no more pending results, returning what we have 19285 1727203919.95472: results queue empty 19285 1727203919.95473: checking for any_errors_fatal 19285 1727203919.95481: done checking for any_errors_fatal 19285 1727203919.95482: checking for max_fail_percentage 19285 1727203919.95483: done checking for max_fail_percentage 19285 1727203919.95485: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.95486: done checking to see if all
hosts have failed 19285 1727203919.95486: getting the remaining hosts for this loop 19285 1727203919.95488: done getting the remaining hosts for this loop 19285 1727203919.95491: getting the next task for host managed-node2 19285 1727203919.95499: done getting next task for host managed-node2 19285 1727203919.95501: ^ task is: TASK: meta (flush_handlers) 19285 1727203919.95503: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203919.95506: getting variables 19285 1727203919.95508: in VariableManager get_vars() 19285 1727203919.95534: Calling all_inventory to load vars for managed-node2 19285 1727203919.95536: Calling groups_inventory to load vars for managed-node2 19285 1727203919.95540: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.95550: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.95552: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.95555: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.96337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.97202: done with get_vars() 19285 1727203919.97218: done getting variables 19285 1727203919.97267: in VariableManager get_vars() 19285 1727203919.97273: Calling all_inventory to load vars for managed-node2 19285 1727203919.97274: Calling groups_inventory to load vars for managed-node2 19285 1727203919.97277: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.97281: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.97282: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.97284: Calling 
groups_plugins_play to load vars for managed-node2 19285 1727203919.97990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203919.98841: done with get_vars() 19285 1727203919.98860: done queuing things up, now waiting for results queue to drain 19285 1727203919.98863: results queue empty 19285 1727203919.98864: checking for any_errors_fatal 19285 1727203919.98866: done checking for any_errors_fatal 19285 1727203919.98866: checking for max_fail_percentage 19285 1727203919.98867: done checking for max_fail_percentage 19285 1727203919.98872: checking to see if all hosts have failed and the running result is not ok 19285 1727203919.98873: done checking to see if all hosts have failed 19285 1727203919.98873: getting the remaining hosts for this loop 19285 1727203919.98874: done getting the remaining hosts for this loop 19285 1727203919.98878: getting the next task for host managed-node2 19285 1727203919.98880: done getting next task for host managed-node2 19285 1727203919.98881: ^ task is: TASK: meta (flush_handlers) 19285 1727203919.98882: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203919.98884: getting variables 19285 1727203919.98884: in VariableManager get_vars() 19285 1727203919.98890: Calling all_inventory to load vars for managed-node2 19285 1727203919.98891: Calling groups_inventory to load vars for managed-node2 19285 1727203919.98893: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203919.98896: Calling all_plugins_play to load vars for managed-node2 19285 1727203919.98898: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203919.98899: Calling groups_plugins_play to load vars for managed-node2 19285 1727203919.99537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.00402: done with get_vars() 19285 1727203920.00426: done getting variables 19285 1727203920.00471: in VariableManager get_vars() 19285 1727203920.00482: Calling all_inventory to load vars for managed-node2 19285 1727203920.00484: Calling groups_inventory to load vars for managed-node2 19285 1727203920.00486: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203920.00490: Calling all_plugins_play to load vars for managed-node2 19285 1727203920.00491: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203920.00493: Calling groups_plugins_play to load vars for managed-node2 19285 1727203920.01212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.02080: done with get_vars() 19285 1727203920.02103: done queuing things up, now waiting for results queue to drain 19285 1727203920.02105: results queue empty 19285 1727203920.02105: checking for any_errors_fatal 19285 1727203920.02106: done checking for any_errors_fatal 19285 1727203920.02107: checking for max_fail_percentage 19285 1727203920.02108: done checking for max_fail_percentage 19285 1727203920.02108: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203920.02109: done checking to see if all hosts have failed 19285 1727203920.02109: getting the remaining hosts for this loop 19285 1727203920.02110: done getting the remaining hosts for this loop 19285 1727203920.02112: getting the next task for host managed-node2 19285 1727203920.02114: done getting next task for host managed-node2 19285 1727203920.02115: ^ task is: None 19285 1727203920.02116: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203920.02117: done queuing things up, now waiting for results queue to drain 19285 1727203920.02117: results queue empty 19285 1727203920.02118: checking for any_errors_fatal 19285 1727203920.02118: done checking for any_errors_fatal 19285 1727203920.02118: checking for max_fail_percentage 19285 1727203920.02119: done checking for max_fail_percentage 19285 1727203920.02119: checking to see if all hosts have failed and the running result is not ok 19285 1727203920.02120: done checking to see if all hosts have failed 19285 1727203920.02121: getting the next task for host managed-node2 19285 1727203920.02122: done getting next task for host managed-node2 19285 1727203920.02122: ^ task is: None 19285 1727203920.02123: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203920.02166: in VariableManager get_vars() 19285 1727203920.02186: done with get_vars() 19285 1727203920.02190: in VariableManager get_vars() 19285 1727203920.02201: done with get_vars() 19285 1727203920.02204: variable 'omit' from source: magic vars 19285 1727203920.02288: variable 'profile' from source: play vars 19285 1727203920.02374: in VariableManager get_vars() 19285 1727203920.02386: done with get_vars() 19285 1727203920.02401: variable 'omit' from source: magic vars 19285 1727203920.02446: variable 'profile' from source: play vars
PLAY [Set down {{ profile }}] **************************************************
19285 1727203920.02901: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203920.02923: getting the remaining hosts for this loop 19285 1727203920.02924: done getting the remaining hosts for this loop 19285 1727203920.02926: getting the next task for host managed-node2 19285 1727203920.02928: done getting next task for host managed-node2 19285 1727203920.02929: ^ task is: TASK: Gathering Facts 19285 1727203920.02930: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 19285 1727203920.02931: getting variables 19285 1727203920.02932: in VariableManager get_vars() 19285 1727203920.02940: Calling all_inventory to load vars for managed-node2 19285 1727203920.02941: Calling groups_inventory to load vars for managed-node2 19285 1727203920.02942: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203920.02946: Calling all_plugins_play to load vars for managed-node2 19285 1727203920.02948: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203920.02949: Calling groups_plugins_play to load vars for managed-node2 19285 1727203920.03621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.04477: done with get_vars() 19285 1727203920.04496: done getting variables 19285 1727203920.04527: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Tuesday 24 September 2024 14:52:00 -0400 (0:00:00.108) 0:00:19.119 *****
19285 1727203920.04545: entering _queue_task() for managed-node2/gather_facts 19285 1727203920.04803: worker is 1 (out of 1 available) 19285 1727203920.04814: exiting _queue_task() for managed-node2/gather_facts 19285 1727203920.04826: done queuing things up, now waiting for results queue to drain 19285 1727203920.04827: waiting for pending results... 
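The Gathering Facts task queued here is carried out over SSH by the low-level command sequence the `_low_level_execute_command()` entries that follow record: probe the remote home directory with `echo ~`, create a private temp directory under `umask 77`, transfer the AnsiballZ_setup.py payload by SFTP, then `chmod u+x` it and run it. A minimal local sketch of that shell sequence, using placeholder paths and a stub payload (the real payload is a self-contained Python module package, and the real paths are the `ansible-tmp-…` ones in the log):

```shell
# Local sketch of the remote command sequence from the log entries below.
# /tmp/ansible-tmp-demo is a placeholder; the log's real tmpdir lives
# under ~/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>.

# 1. Home-directory probe (command string taken verbatim from the log):
home_dir=$(/bin/sh -c 'echo ~ && sleep 0')

# 2. Private tempdir: umask 77 makes the new directory mode 0700
#    (owner-only), mirroring the `( umask 77 && mkdir ... )` pattern.
tmpdir="/tmp/ansible-tmp-demo"
rm -rf "$tmpdir"
( umask 77 && mkdir -p "$(dirname "$tmpdir")" && mkdir "$tmpdir" )

# 3. Stand-in for the transferred AnsiballZ payload (a shell stub keeps
#    this sketch runnable; it is NOT what Ansible actually transfers):
printf '%s\n' '#!/bin/sh' 'echo module-ran' > "$tmpdir/AnsiballZ_setup.py"

# 4. Mark directory and payload executable, then run it, as in the log:
/bin/sh -c "chmod u+x $tmpdir/ $tmpdir/AnsiballZ_setup.py && sleep 0"
"$tmpdir/AnsiballZ_setup.py"
```

The `umask 77` step matters because module arguments (which can include secrets) are written into this directory; it guarantees the tempdir comes up mode 0700 regardless of the login shell's default umask.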
19285 1727203920.04994: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203920.05067: in run() - task 028d2410-947f-f31b-fb3f-0000000002b5 19285 1727203920.05072: variable 'ansible_search_path' from source: unknown 19285 1727203920.05100: calling self._execute() 19285 1727203920.05177: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203920.05181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203920.05191: variable 'omit' from source: magic vars 19285 1727203920.05455: variable 'ansible_distribution_major_version' from source: facts 19285 1727203920.05480: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203920.05485: variable 'omit' from source: magic vars 19285 1727203920.05488: variable 'omit' from source: magic vars 19285 1727203920.05516: variable 'omit' from source: magic vars 19285 1727203920.05549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203920.05578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203920.05595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203920.05611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203920.05622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203920.05646: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203920.05649: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203920.05651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203920.05726: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203920.05732: Set connection var ansible_pipelining to False 19285 1727203920.05738: Set connection var ansible_timeout to 10 19285 1727203920.05741: Set connection var ansible_shell_type to sh 19285 1727203920.05747: Set connection var ansible_shell_executable to /bin/sh 19285 1727203920.05749: Set connection var ansible_connection to ssh 19285 1727203920.05766: variable 'ansible_shell_executable' from source: unknown 19285 1727203920.05769: variable 'ansible_connection' from source: unknown 19285 1727203920.05771: variable 'ansible_module_compression' from source: unknown 19285 1727203920.05774: variable 'ansible_shell_type' from source: unknown 19285 1727203920.05778: variable 'ansible_shell_executable' from source: unknown 19285 1727203920.05780: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203920.05831: variable 'ansible_pipelining' from source: unknown 19285 1727203920.05834: variable 'ansible_timeout' from source: unknown 19285 1727203920.05836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203920.05925: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203920.05940: variable 'omit' from source: magic vars 19285 1727203920.05943: starting attempt loop 19285 1727203920.05946: running the handler 19285 1727203920.05954: variable 'ansible_facts' from source: unknown 19285 1727203920.05972: _low_level_execute_command(): starting 19285 1727203920.05980: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203920.06523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19285 1727203920.06527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203920.06531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203920.06533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.06570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.06573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203920.06588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203920.06673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.08389: stdout chunk (state=3): >>>/root <<< 19285 1727203920.08492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203920.08524: stderr chunk (state=3): >>><<< 19285 1727203920.08528: stdout chunk (state=3): >>><<< 19285 1727203920.08549: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203920.08567: _low_level_execute_command(): starting 19285 1727203920.08571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663 `" && echo ansible-tmp-1727203920.0854883-20889-22804043538663="` echo /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663 `" ) && sleep 0' 19285 1727203920.09037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203920.09041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.09052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203920.09055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203920.09057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.09108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.09116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203920.09119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203920.09188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.11131: stdout chunk (state=3): >>>ansible-tmp-1727203920.0854883-20889-22804043538663=/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663 <<< 19285 1727203920.11234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203920.11266: stderr chunk (state=3): >>><<< 19285 1727203920.11269: stdout chunk (state=3): >>><<< 19285 1727203920.11289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203920.0854883-20889-22804043538663=/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203920.11314: variable 'ansible_module_compression' from source: unknown 19285 1727203920.11354: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203920.11410: variable 'ansible_facts' from source: unknown 19285 1727203920.11544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py 19285 1727203920.11651: Sending initial data 19285 1727203920.11654: Sent initial data (153 bytes) 19285 1727203920.12113: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203920.12116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203920.12120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.12123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203920.12125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.12179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.12182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203920.12264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.13959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203920.14035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203920.14086: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpfufx2a1w /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py <<< 19285 1727203920.14090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py" <<< 19285 1727203920.14185: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpfufx2a1w" to remote "/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py" <<< 19285 1727203920.16059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203920.16201: stderr chunk (state=3): >>><<< 19285 1727203920.16205: stdout chunk (state=3): >>><<< 19285 1727203920.16207: done transferring module to remote 19285 1727203920.16209: _low_level_execute_command(): starting 19285 1727203920.16212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/ /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py && sleep 0' 19285 1727203920.16742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203920.16756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203920.16775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.16823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.16842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203920.16907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.18842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203920.18846: stdout chunk (state=3): >>><<< 19285 1727203920.18848: stderr chunk (state=3): >>><<< 19285 1727203920.18864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203920.18880: _low_level_execute_command(): starting 19285 1727203920.18956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/AnsiballZ_setup.py && sleep 0' 19285 1727203920.19568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203920.19691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203920.19694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.19740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.19757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203920.19779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
19285 1727203920.19888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.83564: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.58056640625, "5m": 0.41015625, "15m": 0.2041015625}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2920, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 611, "free": 2920}, "nocache": {"free": 3276, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787840512, "block_size": 4096, "block_total": 65519099, "block_available": 63913047, "block_used": 1606052, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "00", "epoch": "1727203920", "epoch_int": "1727203920", "date": "2024-09-24", "time": "14:52:00", "iso8601_micro": "2024-09-24T18:52:00.782358Z", "iso8601": "2024-09-24T18:52:00Z", "iso8601_basic": "20240924T145200782358", "iso8601_basic_short": "20240924T145200", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": 
"127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203920.85685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203920.85690: stdout chunk (state=3): >>><<< 19285 1727203920.85693: stderr chunk (state=3): >>><<< 19285 1727203920.85697: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.58056640625, "5m": 0.41015625, "15m": 0.2041015625}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2920, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 611, "free": 2920}, "nocache": {"free": 3276, "used": 255}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, 
"labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787840512, "block_size": 4096, "block_total": 65519099, "block_available": 63913047, "block_used": 1606052, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "00", "epoch": "1727203920", "epoch_int": "1727203920", "date": "2024-09-24", "time": "14:52:00", "iso8601_micro": "2024-09-24T18:52:00.782358Z", "iso8601": "2024-09-24T18:52:00Z", "iso8601_basic": "20240924T145200782358", "iso8601_basic_short": "20240924T145200", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "92:1a:91:1f:57:29", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203920.86275: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203920.86311: _low_level_execute_command(): starting 19285 1727203920.86324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203920.0854883-20889-22804043538663/ > /dev/null 2>&1 && sleep 0' 19285 1727203920.87792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203920.88073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203920.88247: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203920.88423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203920.88512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203920.88615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203920.88751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203920.91011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203920.91017: stdout chunk (state=3): >>><<< 19285 1727203920.91020: stderr chunk (state=3): >>><<< 19285 1727203920.91022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203920.91025: handler run complete 19285 1727203920.91338: variable 'ansible_facts' from source: unknown 19285 1727203920.91586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.92402: variable 'ansible_facts' from source: unknown 19285 1727203920.92672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.93007: attempt loop complete, returning result 19285 1727203920.93017: _execute() done 19285 1727203920.93027: dumping result to json 19285 1727203920.93068: done dumping result, returning 19285 1727203920.93085: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-0000000002b5] 19285 1727203920.93099: sending task result for task 028d2410-947f-f31b-fb3f-0000000002b5 19285 1727203920.94623: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002b5 19285 1727203920.94627: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203920.95113: no more pending results, returning what we have 19285 1727203920.95117: results queue empty 19285 1727203920.95118: checking for any_errors_fatal 19285 1727203920.95119: done checking for any_errors_fatal 19285 1727203920.95120: checking for 
max_fail_percentage 19285 1727203920.95122: done checking for max_fail_percentage 19285 1727203920.95123: checking to see if all hosts have failed and the running result is not ok 19285 1727203920.95124: done checking to see if all hosts have failed 19285 1727203920.95125: getting the remaining hosts for this loop 19285 1727203920.95126: done getting the remaining hosts for this loop 19285 1727203920.95130: getting the next task for host managed-node2 19285 1727203920.95135: done getting next task for host managed-node2 19285 1727203920.95137: ^ task is: TASK: meta (flush_handlers) 19285 1727203920.95139: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203920.95143: getting variables 19285 1727203920.95144: in VariableManager get_vars() 19285 1727203920.95185: Calling all_inventory to load vars for managed-node2 19285 1727203920.95188: Calling groups_inventory to load vars for managed-node2 19285 1727203920.95191: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203920.95201: Calling all_plugins_play to load vars for managed-node2 19285 1727203920.95203: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203920.95206: Calling groups_plugins_play to load vars for managed-node2 19285 1727203920.96712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203920.98713: done with get_vars() 19285 1727203920.98789: done getting variables 19285 1727203920.98986: in VariableManager get_vars() 19285 1727203920.99002: Calling all_inventory to load vars for managed-node2 19285 1727203920.99004: Calling groups_inventory to load vars for managed-node2 19285 1727203920.99007: Calling all_plugins_inventory to load 
vars for managed-node2 19285 1727203920.99012: Calling all_plugins_play to load vars for managed-node2 19285 1727203920.99015: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203920.99018: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.14468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.18568: done with get_vars() 19285 1727203921.18723: done queuing things up, now waiting for results queue to drain 19285 1727203921.18726: results queue empty 19285 1727203921.18727: checking for any_errors_fatal 19285 1727203921.18731: done checking for any_errors_fatal 19285 1727203921.18732: checking for max_fail_percentage 19285 1727203921.18733: done checking for max_fail_percentage 19285 1727203921.18738: checking to see if all hosts have failed and the running result is not ok 19285 1727203921.18739: done checking to see if all hosts have failed 19285 1727203921.18740: getting the remaining hosts for this loop 19285 1727203921.18741: done getting the remaining hosts for this loop 19285 1727203921.18744: getting the next task for host managed-node2 19285 1727203921.18748: done getting next task for host managed-node2 19285 1727203921.18752: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203921.18754: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203921.18764: getting variables 19285 1727203921.18766: in VariableManager get_vars() 19285 1727203921.18929: Calling all_inventory to load vars for managed-node2 19285 1727203921.18933: Calling groups_inventory to load vars for managed-node2 19285 1727203921.18935: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.18941: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.18943: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.18946: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.21648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.24709: done with get_vars() 19285 1727203921.24753: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:52:01 -0400 (0:00:01.203) 0:00:20.323 ***** 19285 1727203921.24893: entering _queue_task() for managed-node2/include_tasks 19285 1727203921.25452: worker is 1 (out of 1 available) 19285 1727203921.25466: exiting _queue_task() for managed-node2/include_tasks 19285 1727203921.25480: done queuing things up, now waiting for results queue to drain 19285 1727203921.25482: waiting for pending results... 
19285 1727203921.26381: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203921.26503: in run() - task 028d2410-947f-f31b-fb3f-00000000003a 19285 1727203921.26508: variable 'ansible_search_path' from source: unknown 19285 1727203921.26511: variable 'ansible_search_path' from source: unknown 19285 1727203921.26585: calling self._execute() 19285 1727203921.26662: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.26677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.26702: variable 'omit' from source: magic vars 19285 1727203921.27115: variable 'ansible_distribution_major_version' from source: facts 19285 1727203921.27152: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203921.27155: _execute() done 19285 1727203921.27239: dumping result to json 19285 1727203921.27242: done dumping result, returning 19285 1727203921.27245: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-f31b-fb3f-00000000003a] 19285 1727203921.27247: sending task result for task 028d2410-947f-f31b-fb3f-00000000003a 19285 1727203921.27335: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003a 19285 1727203921.27338: WORKER PROCESS EXITING 19285 1727203921.27385: no more pending results, returning what we have 19285 1727203921.27391: in VariableManager get_vars() 19285 1727203921.27438: Calling all_inventory to load vars for managed-node2 19285 1727203921.27442: Calling groups_inventory to load vars for managed-node2 19285 1727203921.27445: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.27461: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.27465: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.27468: Calling 
groups_plugins_play to load vars for managed-node2 19285 1727203921.30705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.32412: done with get_vars() 19285 1727203921.32439: variable 'ansible_search_path' from source: unknown 19285 1727203921.32441: variable 'ansible_search_path' from source: unknown 19285 1727203921.32484: we have included files to process 19285 1727203921.32485: generating all_blocks data 19285 1727203921.32487: done generating all_blocks data 19285 1727203921.32487: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203921.32489: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203921.32492: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203921.33083: done processing included file 19285 1727203921.33086: iterating over new_blocks loaded from include file 19285 1727203921.33087: in VariableManager get_vars() 19285 1727203921.33108: done with get_vars() 19285 1727203921.33110: filtering new block on tags 19285 1727203921.33133: done filtering new block on tags 19285 1727203921.33136: in VariableManager get_vars() 19285 1727203921.33154: done with get_vars() 19285 1727203921.33155: filtering new block on tags 19285 1727203921.33173: done filtering new block on tags 19285 1727203921.33177: in VariableManager get_vars() 19285 1727203921.33195: done with get_vars() 19285 1727203921.33197: filtering new block on tags 19285 1727203921.33212: done filtering new block on tags 19285 1727203921.33214: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 19285 1727203921.33219: extending task lists for 
all hosts with included blocks 19285 1727203921.33615: done extending task lists 19285 1727203921.33616: done processing included files 19285 1727203921.33617: results queue empty 19285 1727203921.33618: checking for any_errors_fatal 19285 1727203921.33620: done checking for any_errors_fatal 19285 1727203921.33620: checking for max_fail_percentage 19285 1727203921.33622: done checking for max_fail_percentage 19285 1727203921.33622: checking to see if all hosts have failed and the running result is not ok 19285 1727203921.33623: done checking to see if all hosts have failed 19285 1727203921.33624: getting the remaining hosts for this loop 19285 1727203921.33625: done getting the remaining hosts for this loop 19285 1727203921.33627: getting the next task for host managed-node2 19285 1727203921.33631: done getting next task for host managed-node2 19285 1727203921.33634: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203921.33636: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203921.33646: getting variables 19285 1727203921.33647: in VariableManager get_vars() 19285 1727203921.33668: Calling all_inventory to load vars for managed-node2 19285 1727203921.33671: Calling groups_inventory to load vars for managed-node2 19285 1727203921.33673: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.33680: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.33683: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.33686: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.34950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.36671: done with get_vars() 19285 1727203921.36702: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:52:01 -0400 (0:00:00.119) 0:00:20.442 ***** 19285 1727203921.36774: entering _queue_task() for managed-node2/setup 19285 1727203921.37607: worker is 1 (out of 1 available) 19285 1727203921.37735: exiting _queue_task() for managed-node2/setup 19285 1727203921.37765: done queuing things up, now waiting for results queue to drain 19285 1727203921.37767: waiting for pending results... 
19285 1727203921.38403: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203921.38410: in run() - task 028d2410-947f-f31b-fb3f-0000000002f6 19285 1727203921.38413: variable 'ansible_search_path' from source: unknown 19285 1727203921.38416: variable 'ansible_search_path' from source: unknown 19285 1727203921.38419: calling self._execute() 19285 1727203921.38422: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.38424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.38428: variable 'omit' from source: magic vars 19285 1727203921.38762: variable 'ansible_distribution_major_version' from source: facts 19285 1727203921.38767: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203921.39086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203921.41498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203921.41568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203921.41604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203921.41681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203921.41685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203921.41741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203921.41769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203921.41795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203921.41833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203921.41847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203921.41902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203921.41923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203921.41945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203921.41983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203921.41996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203921.42157: variable '__network_required_facts' from source: role 
'' defaults 19285 1727203921.42160: variable 'ansible_facts' from source: unknown 19285 1727203921.43135: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19285 1727203921.43139: when evaluation is False, skipping this task 19285 1727203921.43142: _execute() done 19285 1727203921.43144: dumping result to json 19285 1727203921.43146: done dumping result, returning 19285 1727203921.43149: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-f31b-fb3f-0000000002f6] 19285 1727203921.43152: sending task result for task 028d2410-947f-f31b-fb3f-0000000002f6 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203921.43449: no more pending results, returning what we have 19285 1727203921.43452: results queue empty 19285 1727203921.43453: checking for any_errors_fatal 19285 1727203921.43455: done checking for any_errors_fatal 19285 1727203921.43456: checking for max_fail_percentage 19285 1727203921.43458: done checking for max_fail_percentage 19285 1727203921.43458: checking to see if all hosts have failed and the running result is not ok 19285 1727203921.43459: done checking to see if all hosts have failed 19285 1727203921.43460: getting the remaining hosts for this loop 19285 1727203921.43463: done getting the remaining hosts for this loop 19285 1727203921.43466: getting the next task for host managed-node2 19285 1727203921.43474: done getting next task for host managed-node2 19285 1727203921.43479: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203921.43482: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203921.43500: getting variables 19285 1727203921.43501: in VariableManager get_vars() 19285 1727203921.43701: Calling all_inventory to load vars for managed-node2 19285 1727203921.43704: Calling groups_inventory to load vars for managed-node2 19285 1727203921.43707: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.43717: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.43720: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.43723: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.44440: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002f6 19285 1727203921.44444: WORKER PROCESS EXITING 19285 1727203921.46252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.48156: done with get_vars() 19285 1727203921.48188: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:52:01 -0400 (0:00:00.115) 0:00:20.557 ***** 19285 1727203921.48295: entering _queue_task() for managed-node2/stat 19285 1727203921.48716: worker is 1 (out of 1 available) 19285 1727203921.48726: exiting _queue_task() for managed-node2/stat 19285 1727203921.48737: done queuing things up, now waiting for results queue to drain 19285 1727203921.48739: waiting for pending results... 
19285 1727203921.48988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203921.49137: in run() - task 028d2410-947f-f31b-fb3f-0000000002f8 19285 1727203921.49158: variable 'ansible_search_path' from source: unknown 19285 1727203921.49165: variable 'ansible_search_path' from source: unknown 19285 1727203921.49206: calling self._execute() 19285 1727203921.49309: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.49320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.49344: variable 'omit' from source: magic vars 19285 1727203921.49739: variable 'ansible_distribution_major_version' from source: facts 19285 1727203921.49757: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203921.50037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203921.50342: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203921.50399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203921.50441: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203921.50484: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203921.50698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203921.50701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203921.50900: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203921.50953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203921.51191: variable '__network_is_ostree' from source: set_fact 19285 1727203921.51388: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203921.51391: when evaluation is False, skipping this task 19285 1727203921.51394: _execute() done 19285 1727203921.51396: dumping result to json 19285 1727203921.51399: done dumping result, returning 19285 1727203921.51401: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-f31b-fb3f-0000000002f8] 19285 1727203921.51404: sending task result for task 028d2410-947f-f31b-fb3f-0000000002f8 19285 1727203921.51478: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002f8 19285 1727203921.51482: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203921.51536: no more pending results, returning what we have 19285 1727203921.51541: results queue empty 19285 1727203921.51542: checking for any_errors_fatal 19285 1727203921.51549: done checking for any_errors_fatal 19285 1727203921.51550: checking for max_fail_percentage 19285 1727203921.51552: done checking for max_fail_percentage 19285 1727203921.51553: checking to see if all hosts have failed and the running result is not ok 19285 1727203921.51554: done checking to see if all hosts have failed 19285 1727203921.51554: getting the remaining hosts for this loop 19285 1727203921.51556: done getting the remaining hosts for this loop 19285 
1727203921.51563: getting the next task for host managed-node2 19285 1727203921.51572: done getting next task for host managed-node2 19285 1727203921.51578: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203921.51581: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203921.51597: getting variables 19285 1727203921.51599: in VariableManager get_vars() 19285 1727203921.51639: Calling all_inventory to load vars for managed-node2 19285 1727203921.51643: Calling groups_inventory to load vars for managed-node2 19285 1727203921.51646: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.51657: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.51660: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.51666: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.56000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.60322: done with get_vars() 19285 1727203921.60502: done getting variables 19285 1727203921.60570: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:52:01 -0400 (0:00:00.123) 0:00:20.680 ***** 19285 1727203921.60636: entering _queue_task() for managed-node2/set_fact 19285 1727203921.61392: worker is 1 (out of 1 available) 19285 1727203921.61404: exiting _queue_task() for managed-node2/set_fact 19285 1727203921.61425: done queuing things up, now waiting for results queue to drain 19285 1727203921.61427: waiting for pending results... 19285 1727203921.61692: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203921.61836: in run() - task 028d2410-947f-f31b-fb3f-0000000002f9 19285 1727203921.61873: variable 'ansible_search_path' from source: unknown 19285 1727203921.61981: variable 'ansible_search_path' from source: unknown 19285 1727203921.61984: calling self._execute() 19285 1727203921.62014: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.62028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.62043: variable 'omit' from source: magic vars 19285 1727203921.62412: variable 'ansible_distribution_major_version' from source: facts 19285 1727203921.62442: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203921.62656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203921.62988: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203921.63037: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203921.63080: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 
1727203921.63116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203921.63292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203921.63295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203921.63318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203921.63346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203921.63449: variable '__network_is_ostree' from source: set_fact 19285 1727203921.63464: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203921.63472: when evaluation is False, skipping this task 19285 1727203921.63509: _execute() done 19285 1727203921.63512: dumping result to json 19285 1727203921.63515: done dumping result, returning 19285 1727203921.63518: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-f31b-fb3f-0000000002f9] 19285 1727203921.63521: sending task result for task 028d2410-947f-f31b-fb3f-0000000002f9 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203921.63665: no more pending results, returning what we have 19285 1727203921.63669: results queue empty 19285 1727203921.63670: checking for any_errors_fatal 19285 1727203921.63678: done checking 
for any_errors_fatal 19285 1727203921.63679: checking for max_fail_percentage 19285 1727203921.63680: done checking for max_fail_percentage 19285 1727203921.63681: checking to see if all hosts have failed and the running result is not ok 19285 1727203921.63682: done checking to see if all hosts have failed 19285 1727203921.63682: getting the remaining hosts for this loop 19285 1727203921.63685: done getting the remaining hosts for this loop 19285 1727203921.63689: getting the next task for host managed-node2 19285 1727203921.63699: done getting next task for host managed-node2 19285 1727203921.63703: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203921.63706: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203921.63719: getting variables 19285 1727203921.63721: in VariableManager get_vars() 19285 1727203921.63756: Calling all_inventory to load vars for managed-node2 19285 1727203921.63759: Calling groups_inventory to load vars for managed-node2 19285 1727203921.63761: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203921.63772: Calling all_plugins_play to load vars for managed-node2 19285 1727203921.63774: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203921.64085: Calling groups_plugins_play to load vars for managed-node2 19285 1727203921.64690: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002f9 19285 1727203921.64694: WORKER PROCESS EXITING 19285 1727203921.65595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203921.68797: done with get_vars() 19285 1727203921.68829: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:52:01 -0400 (0:00:00.083) 0:00:20.764 ***** 19285 1727203921.68959: entering _queue_task() for managed-node2/service_facts 19285 1727203921.69324: worker is 1 (out of 1 available) 19285 1727203921.69338: exiting _queue_task() for managed-node2/service_facts 19285 1727203921.69350: done queuing things up, now waiting for results queue to drain 19285 1727203921.69352: waiting for pending results... 
19285 1727203921.69644: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203921.69795: in run() - task 028d2410-947f-f31b-fb3f-0000000002fb 19285 1727203921.69820: variable 'ansible_search_path' from source: unknown 19285 1727203921.69829: variable 'ansible_search_path' from source: unknown 19285 1727203921.69872: calling self._execute() 19285 1727203921.69993: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.70005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.70025: variable 'omit' from source: magic vars 19285 1727203921.70410: variable 'ansible_distribution_major_version' from source: facts 19285 1727203921.70429: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203921.70442: variable 'omit' from source: magic vars 19285 1727203921.70506: variable 'omit' from source: magic vars 19285 1727203921.70549: variable 'omit' from source: magic vars 19285 1727203921.70605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203921.70647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203921.70680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203921.70703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203921.70720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203921.70757: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203921.70765: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.70775: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 19285 1727203921.70891: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203921.70951: Set connection var ansible_pipelining to False 19285 1727203921.70954: Set connection var ansible_timeout to 10 19285 1727203921.70956: Set connection var ansible_shell_type to sh 19285 1727203921.70958: Set connection var ansible_shell_executable to /bin/sh 19285 1727203921.70960: Set connection var ansible_connection to ssh 19285 1727203921.71005: variable 'ansible_shell_executable' from source: unknown 19285 1727203921.71010: variable 'ansible_connection' from source: unknown 19285 1727203921.71025: variable 'ansible_module_compression' from source: unknown 19285 1727203921.71108: variable 'ansible_shell_type' from source: unknown 19285 1727203921.71111: variable 'ansible_shell_executable' from source: unknown 19285 1727203921.71113: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203921.71116: variable 'ansible_pipelining' from source: unknown 19285 1727203921.71118: variable 'ansible_timeout' from source: unknown 19285 1727203921.71120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203921.71315: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203921.71348: variable 'omit' from source: magic vars 19285 1727203921.71359: starting attempt loop 19285 1727203921.71377: running the handler 19285 1727203921.71408: _low_level_execute_command(): starting 19285 1727203921.71422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203921.73364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203921.73391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203921.73540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203921.73647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203921.73780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203921.75531: stdout chunk (state=3): >>>/root <<< 19285 1727203921.75684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203921.75711: stdout chunk (state=3): >>><<< 19285 1727203921.75714: stderr chunk (state=3): >>><<< 19285 1727203921.75883: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203921.75888: _low_level_execute_command(): starting 19285 1727203921.75890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928 `" && echo ansible-tmp-1727203921.75818-20941-12222292434928="` echo /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928 `" ) && sleep 0' 19285 1727203921.77685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203921.77801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203921.77837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203921.77856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203921.78443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203921.78447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203921.78463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203921.78554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203921.80485: stdout chunk (state=3): >>>ansible-tmp-1727203921.75818-20941-12222292434928=/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928 <<< 19285 1727203921.80591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203921.80767: stderr chunk (state=3): >>><<< 19285 1727203921.80771: stdout chunk (state=3): >>><<< 19285 1727203921.80796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203921.75818-20941-12222292434928=/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203921.80865: variable 'ansible_module_compression' from source: unknown 19285 1727203921.81015: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19285 1727203921.81059: variable 'ansible_facts' from source: unknown 19285 1727203921.81192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py 19285 1727203921.81624: Sending initial data 19285 1727203921.81635: Sent initial data (159 bytes) 19285 1727203921.83336: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203921.83722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203921.83823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203921.85442: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203921.85670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203921.85738: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpmhu0xvh_ /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py <<< 19285 1727203921.85757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py" <<< 19285 1727203921.85900: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpmhu0xvh_" to remote "/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py" <<< 19285 1727203921.87933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203921.87938: stderr chunk (state=3): >>><<< 19285 1727203921.87941: stdout chunk (state=3): >>><<< 19285 1727203921.88274: done transferring module to remote 19285 1727203921.88383: _low_level_execute_command(): starting 19285 1727203921.88386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/ /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py && sleep 0' 19285 1727203921.89846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203921.89935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203921.90093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203921.90118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203921.90292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203921.92311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203921.92318: stdout chunk (state=3): >>><<< 19285 1727203921.92321: stderr chunk (state=3): >>><<< 19285 1727203921.92421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203921.92425: _low_level_execute_command(): starting 19285 1727203921.92428: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/AnsiballZ_service_facts.py && sleep 0' 19285 1727203921.93783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203921.93887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203921.93900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203921.93915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203921.93929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203921.94188: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203921.94302: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19285 1727203921.94412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.49324: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 19285 1727203923.49339: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 19285 1727203923.49360: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19285 1727203923.50868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203923.51025: stdout chunk (state=3): >>><<< 19285 1727203923.51028: stderr chunk (state=3): >>><<< 19285 1727203923.51383: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": 
{"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203923.53699: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203923.53717: _low_level_execute_command(): starting 19285 1727203923.53765: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203921.75818-20941-12222292434928/ > /dev/null 2>&1 && sleep 0' 19285 1727203923.54795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203923.54810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203923.54827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203923.54846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203923.54866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203923.54891: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203923.54986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.55196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.55368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.57304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203923.57316: stdout chunk (state=3): >>><<< 19285 1727203923.57500: stderr chunk (state=3): >>><<< 19285 1727203923.57504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203923.57551: handler run complete
19285 1727203923.58314: variable 'ansible_facts' from source: unknown
19285 1727203923.58594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203923.59609: variable 'ansible_facts' from source: unknown
19285 1727203923.59881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203923.60197: attempt loop complete, returning result
19285 1727203923.60203: _execute() done
19285 1727203923.60205: dumping result to json
19285 1727203923.60295: done dumping result, returning
19285 1727203923.60304: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-f31b-fb3f-0000000002fb]
19285 1727203923.60310: sending task result for task 028d2410-947f-f31b-fb3f-0000000002fb
19285 1727203923.61638: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002fb
ok: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
19285 1727203923.61794: no more pending results, returning what we have
19285 1727203923.61797: results queue empty
19285 1727203923.61798: checking for any_errors_fatal
19285 1727203923.61802: done checking for any_errors_fatal
19285 1727203923.61803: checking for max_fail_percentage
19285 1727203923.61805: done checking for max_fail_percentage
19285 1727203923.61806: checking to see if all hosts have failed and the running result is not ok
19285 1727203923.61806: done checking to see if all hosts have failed
19285 1727203923.61807: getting the remaining hosts for this loop
19285 1727203923.61809: done getting the remaining hosts for this loop
19285 1727203923.61812: getting the next task for host managed-node2
19285 1727203923.61817: done getting next task for host managed-node2
19285 1727203923.61912: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed
19285 1727203923.61916: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203923.61926: WORKER PROCESS EXITING
19285 1727203923.61957: getting variables
19285 1727203923.61959: in VariableManager get_vars()
19285 1727203923.61992: Calling all_inventory to load vars for managed-node2
19285 1727203923.61995: Calling groups_inventory to load vars for managed-node2
19285 1727203923.61998: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203923.62007: Calling all_plugins_play to load vars for managed-node2
19285 1727203923.62010: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203923.62013: Calling groups_plugins_play to load vars for managed-node2
19285 1727203923.63525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203923.66424: done with get_vars()
19285 1727203923.66497: done getting variables

TASK [fedora.linux_system_roles.network : Check which packages are installed] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Tuesday 24 September 2024 14:52:03 -0400 (0:00:01.976) 0:00:22.740 *****
19285 1727203923.66601: entering _queue_task() for managed-node2/package_facts
19285
1727203923.67154: worker is 1 (out of 1 available) 19285 1727203923.67171: exiting _queue_task() for managed-node2/package_facts 19285 1727203923.67210: done queuing things up, now waiting for results queue to drain 19285 1727203923.67212: waiting for pending results... 19285 1727203923.67503: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 19285 1727203923.67658: in run() - task 028d2410-947f-f31b-fb3f-0000000002fc 19285 1727203923.67730: variable 'ansible_search_path' from source: unknown 19285 1727203923.67734: variable 'ansible_search_path' from source: unknown 19285 1727203923.67736: calling self._execute() 19285 1727203923.67870: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203923.67916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203923.67952: variable 'omit' from source: magic vars 19285 1727203923.68497: variable 'ansible_distribution_major_version' from source: facts 19285 1727203923.68501: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203923.68507: variable 'omit' from source: magic vars 19285 1727203923.68571: variable 'omit' from source: magic vars 19285 1727203923.68623: variable 'omit' from source: magic vars 19285 1727203923.68684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203923.68745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203923.68815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203923.68818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203923.68839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 19285 1727203923.68878: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203923.68924: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203923.68928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203923.69014: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203923.69032: Set connection var ansible_pipelining to False 19285 1727203923.69045: Set connection var ansible_timeout to 10 19285 1727203923.69052: Set connection var ansible_shell_type to sh 19285 1727203923.69070: Set connection var ansible_shell_executable to /bin/sh 19285 1727203923.69141: Set connection var ansible_connection to ssh 19285 1727203923.69145: variable 'ansible_shell_executable' from source: unknown 19285 1727203923.69147: variable 'ansible_connection' from source: unknown 19285 1727203923.69150: variable 'ansible_module_compression' from source: unknown 19285 1727203923.69152: variable 'ansible_shell_type' from source: unknown 19285 1727203923.69154: variable 'ansible_shell_executable' from source: unknown 19285 1727203923.69156: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203923.69158: variable 'ansible_pipelining' from source: unknown 19285 1727203923.69160: variable 'ansible_timeout' from source: unknown 19285 1727203923.69165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203923.69374: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203923.69396: variable 'omit' from source: magic vars 19285 1727203923.69405: starting attempt loop 19285 1727203923.69411: running the handler 19285 1727203923.69430: _low_level_execute_command(): starting 19285 
1727203923.69470: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203923.70406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203923.70479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203923.70569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.70592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203923.70631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.70689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.70804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.72524: stdout chunk (state=3): >>>/root <<< 19285 1727203923.72835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203923.72838: stdout chunk (state=3): >>><<< 19285 1727203923.72840: stderr chunk (state=3): >>><<< 19285 1727203923.72844: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203923.72847: _low_level_execute_command(): starting 19285 1727203923.72849: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144 `" && echo ansible-tmp-1727203923.7271037-21239-177234439091144="` echo /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144 `" ) && sleep 0' 19285 1727203923.73497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.73538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203923.73564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.73626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.73734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.75664: stdout chunk (state=3): >>>ansible-tmp-1727203923.7271037-21239-177234439091144=/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144 <<< 19285 1727203923.75799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203923.75838: stdout chunk (state=3): >>><<< 19285 1727203923.75842: stderr chunk (state=3): >>><<< 19285 1727203923.76089: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203923.7271037-21239-177234439091144=/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203923.76093: variable 'ansible_module_compression' from source: unknown 19285 1727203923.76095: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19285 1727203923.76100: variable 'ansible_facts' from source: unknown 19285 1727203923.76269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py 19285 1727203923.76394: Sending initial data 19285 1727203923.76397: Sent initial data (162 bytes) 19285 1727203923.76971: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203923.76977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203923.76980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.76983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203923.76986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.77039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.77043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.77110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.78718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203923.78822: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203923.78858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpn77xxp4l /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py <<< 19285 1727203923.78861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py" <<< 19285 1727203923.78944: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpn77xxp4l" to remote "/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py" <<< 19285 1727203923.80297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203923.80337: stderr chunk (state=3): >>><<< 19285 1727203923.80340: stdout chunk (state=3): >>><<< 19285 1727203923.80372: done transferring module to remote 19285 1727203923.80384: _low_level_execute_command(): starting 19285 1727203923.80389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/ /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py && sleep 0' 19285 1727203923.80978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.81037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.81099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203923.82942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203923.82994: stderr chunk (state=3): >>><<< 19285 1727203923.83004: stdout chunk (state=3): >>><<< 19285 1727203923.83159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203923.83163: _low_level_execute_command(): starting 19285 1727203923.83169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/AnsiballZ_package_facts.py && sleep 0' 19285 1727203923.83915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.83981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203923.84005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203923.84052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203923.84080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203923.84127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203923.84226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
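The stdout that follows is the JSON result of the `package_facts` module: a single `ansible_facts.packages` mapping from package name to a list of installed versions (rpm name/version/release/epoch/arch records). A minimal sketch of consuming such a payload — the sample data is abridged from the log, and the `installed_version` helper is illustrative, not part of Ansible:

```python
import json

# Abridged sample of the package_facts output recorded in the log above.
raw = json.dumps({
    "ansible_facts": {
        "packages": {
            "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10",
                       "epoch": None, "arch": "x86_64", "source": "rpm"}],
            "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10",
                      "epoch": None, "arch": "x86_64", "source": "rpm"}],
        }
    }
})

packages = json.loads(raw)["ansible_facts"]["packages"]

def installed_version(name):
    """Return the first reported version of a package, or None if absent."""
    entries = packages.get(name, [])
    return entries[0]["version"] if entries else None

print(installed_version("glibc"))           # -> 2.39
print(installed_version("NetworkManager"))  # -> None (not in this sample)
```

Each package maps to a *list* because multiple versions of the same name can be installed simultaneously (e.g. kernel packages), so callers should not assume a single entry.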
19285 1727203924.29087: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", 
"version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", 
"version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version":
"0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 19285 1727203924.29143: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], 
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": 
[{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 19285 1727203924.29169: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 19285 1727203924.29207: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19285 1727203924.31056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203924.31291: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.13.254 closed. 19285 1727203924.33551: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203924.33589: _low_level_execute_command(): starting 19285 1727203924.33600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203923.7271037-21239-177234439091144/ > /dev/null 2>&1 && sleep 0' 19285 1727203924.34199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203924.34213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203924.34319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203924.34331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203924.34448: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203924.34520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203924.36458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203924.36462: stdout chunk (state=3): >>><<< 19285 1727203924.36464: stderr chunk (state=3): >>><<< 19285 1727203924.36487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203924.36703: handler run 
complete 19285 1727203924.37606: variable 'ansible_facts' from source: unknown 19285 1727203924.38367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.41285: variable 'ansible_facts' from source: unknown 19285 1727203924.42130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.42891: attempt loop complete, returning result 19285 1727203924.42909: _execute() done 19285 1727203924.42922: dumping result to json 19285 1727203924.43168: done dumping result, returning 19285 1727203924.43188: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-f31b-fb3f-0000000002fc] 19285 1727203924.43197: sending task result for task 028d2410-947f-f31b-fb3f-0000000002fc 19285 1727203924.46425: done sending task result for task 028d2410-947f-f31b-fb3f-0000000002fc 19285 1727203924.46428: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203924.46582: no more pending results, returning what we have 19285 1727203924.46585: results queue empty 19285 1727203924.46586: checking for any_errors_fatal 19285 1727203924.46590: done checking for any_errors_fatal 19285 1727203924.46591: checking for max_fail_percentage 19285 1727203924.46593: done checking for max_fail_percentage 19285 1727203924.46594: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.46594: done checking to see if all hosts have failed 19285 1727203924.46595: getting the remaining hosts for this loop 19285 1727203924.46598: done getting the remaining hosts for this loop 19285 1727203924.46602: getting the next task for host managed-node2 19285 1727203924.46609: done getting next task for host managed-node2 19285 
1727203924.46613: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203924.46615: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203924.46624: getting variables 19285 1727203924.46626: in VariableManager get_vars() 19285 1727203924.46704: Calling all_inventory to load vars for managed-node2 19285 1727203924.46708: Calling groups_inventory to load vars for managed-node2 19285 1727203924.46711: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.46720: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.46722: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.46725: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.48080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.50149: done with get_vars() 19285 1727203924.50185: done getting variables 19285 1727203924.50268: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.837) 0:00:23.577 ***** 19285 1727203924.50317: entering _queue_task() for managed-node2/debug 19285 1727203924.50739: worker is 1 (out of 1 available) 19285 1727203924.50751: exiting _queue_task() for 
managed-node2/debug 19285 1727203924.50763: done queuing things up, now waiting for results queue to drain 19285 1727203924.50764: waiting for pending results... 19285 1727203924.51198: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203924.51206: in run() - task 028d2410-947f-f31b-fb3f-00000000003b 19285 1727203924.51209: variable 'ansible_search_path' from source: unknown 19285 1727203924.51212: variable 'ansible_search_path' from source: unknown 19285 1727203924.51262: calling self._execute() 19285 1727203924.51498: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.51511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.51533: variable 'omit' from source: magic vars 19285 1727203924.52158: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.52184: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203924.52229: variable 'omit' from source: magic vars 19285 1727203924.52291: variable 'omit' from source: magic vars 19285 1727203924.52388: variable 'network_provider' from source: set_fact 19285 1727203924.52481: variable 'omit' from source: magic vars 19285 1727203924.52484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203924.52530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203924.52555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203924.52580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203924.52596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203924.52653: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203924.52670: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.52717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.52865: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203924.52880: Set connection var ansible_pipelining to False 19285 1727203924.52907: Set connection var ansible_timeout to 10 19285 1727203924.52910: Set connection var ansible_shell_type to sh 19285 1727203924.52924: Set connection var ansible_shell_executable to /bin/sh 19285 1727203924.52943: Set connection var ansible_connection to ssh 19285 1727203924.52994: variable 'ansible_shell_executable' from source: unknown 19285 1727203924.53000: variable 'ansible_connection' from source: unknown 19285 1727203924.53003: variable 'ansible_module_compression' from source: unknown 19285 1727203924.53005: variable 'ansible_shell_type' from source: unknown 19285 1727203924.53007: variable 'ansible_shell_executable' from source: unknown 19285 1727203924.53008: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.53059: variable 'ansible_pipelining' from source: unknown 19285 1727203924.53062: variable 'ansible_timeout' from source: unknown 19285 1727203924.53064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.53246: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203924.53270: variable 'omit' from source: magic vars 19285 1727203924.53272: starting attempt loop 19285 1727203924.53383: running the handler 19285 1727203924.53386: handler run complete 19285 1727203924.53389: attempt loop complete, 
returning result 19285 1727203924.53392: _execute() done 19285 1727203924.53394: dumping result to json 19285 1727203924.53397: done dumping result, returning 19285 1727203924.53399: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-f31b-fb3f-00000000003b] 19285 1727203924.53401: sending task result for task 028d2410-947f-f31b-fb3f-00000000003b ok: [managed-node2] => {} MSG: Using network provider: nm 19285 1727203924.53551: no more pending results, returning what we have 19285 1727203924.53555: results queue empty 19285 1727203924.53556: checking for any_errors_fatal 19285 1727203924.53568: done checking for any_errors_fatal 19285 1727203924.53569: checking for max_fail_percentage 19285 1727203924.53572: done checking for max_fail_percentage 19285 1727203924.53573: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.53574: done checking to see if all hosts have failed 19285 1727203924.53575: getting the remaining hosts for this loop 19285 1727203924.53579: done getting the remaining hosts for this loop 19285 1727203924.53583: getting the next task for host managed-node2 19285 1727203924.53593: done getting next task for host managed-node2 19285 1727203924.53597: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203924.53599: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203924.53611: getting variables 19285 1727203924.53613: in VariableManager get_vars() 19285 1727203924.53655: Calling all_inventory to load vars for managed-node2 19285 1727203924.53658: Calling groups_inventory to load vars for managed-node2 19285 1727203924.53660: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.53672: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.53675: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.53908: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.54522: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003b 19285 1727203924.54525: WORKER PROCESS EXITING 19285 1727203924.55525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.56726: done with get_vars() 19285 1727203924.56758: done getting variables 19285 1727203924.56852: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.065) 0:00:23.643 ***** 19285 1727203924.56906: entering _queue_task() for managed-node2/fail 19285 1727203924.57229: worker is 1 (out of 1 available) 19285 1727203924.57243: exiting _queue_task() for managed-node2/fail 19285 1727203924.57255: done queuing things up, now waiting for results queue to drain 19285 1727203924.57257: waiting for pending results... 
19285 1727203924.57425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203924.57575: in run() - task 028d2410-947f-f31b-fb3f-00000000003c 19285 1727203924.57582: variable 'ansible_search_path' from source: unknown 19285 1727203924.57586: variable 'ansible_search_path' from source: unknown 19285 1727203924.57682: calling self._execute() 19285 1727203924.57754: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.57758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.57761: variable 'omit' from source: magic vars 19285 1727203924.58558: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.58564: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203924.58786: variable 'network_state' from source: role '' defaults 19285 1727203924.58868: Evaluated conditional (network_state != {}): False 19285 1727203924.58952: when evaluation is False, skipping this task 19285 1727203924.58956: _execute() done 19285 1727203924.58958: dumping result to json 19285 1727203924.59038: done dumping result, returning 19285 1727203924.59041: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-f31b-fb3f-00000000003c] 19285 1727203924.59045: sending task result for task 028d2410-947f-f31b-fb3f-00000000003c 19285 1727203924.59118: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003c 19285 1727203924.59121: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203924.59171: no more pending results, 
returning what we have 19285 1727203924.59178: results queue empty 19285 1727203924.59179: checking for any_errors_fatal 19285 1727203924.59188: done checking for any_errors_fatal 19285 1727203924.59189: checking for max_fail_percentage 19285 1727203924.59191: done checking for max_fail_percentage 19285 1727203924.59192: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.59193: done checking to see if all hosts have failed 19285 1727203924.59194: getting the remaining hosts for this loop 19285 1727203924.59195: done getting the remaining hosts for this loop 19285 1727203924.59199: getting the next task for host managed-node2 19285 1727203924.59206: done getting next task for host managed-node2 19285 1727203924.59211: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203924.59214: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203924.59232: getting variables 19285 1727203924.59234: in VariableManager get_vars() 19285 1727203924.59284: Calling all_inventory to load vars for managed-node2 19285 1727203924.59292: Calling groups_inventory to load vars for managed-node2 19285 1727203924.59295: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.59309: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.59313: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.59316: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.61238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.62703: done with get_vars() 19285 1727203924.62723: done getting variables 19285 1727203924.62769: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.059) 0:00:23.702 ***** 19285 1727203924.62817: entering _queue_task() for managed-node2/fail 19285 1727203924.63158: worker is 1 (out of 1 available) 19285 1727203924.63172: exiting _queue_task() for managed-node2/fail 19285 1727203924.63187: done queuing things up, now waiting for results queue to drain 19285 1727203924.63188: waiting for pending results... 
19285 1727203924.63694: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203924.63699: in run() - task 028d2410-947f-f31b-fb3f-00000000003d 19285 1727203924.63703: variable 'ansible_search_path' from source: unknown 19285 1727203924.63706: variable 'ansible_search_path' from source: unknown 19285 1727203924.63713: calling self._execute() 19285 1727203924.63769: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.63782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.63796: variable 'omit' from source: magic vars 19285 1727203924.64150: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.64392: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203924.64397: variable 'network_state' from source: role '' defaults 19285 1727203924.64400: Evaluated conditional (network_state != {}): False 19285 1727203924.64402: when evaluation is False, skipping this task 19285 1727203924.64405: _execute() done 19285 1727203924.64407: dumping result to json 19285 1727203924.64409: done dumping result, returning 19285 1727203924.64411: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-f31b-fb3f-00000000003d] 19285 1727203924.64414: sending task result for task 028d2410-947f-f31b-fb3f-00000000003d 19285 1727203924.64947: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003d 19285 1727203924.64951: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203924.64998: no more pending results, returning what we have 19285 
1727203924.65001: results queue empty 19285 1727203924.65002: checking for any_errors_fatal 19285 1727203924.65006: done checking for any_errors_fatal 19285 1727203924.65007: checking for max_fail_percentage 19285 1727203924.65009: done checking for max_fail_percentage 19285 1727203924.65010: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.65011: done checking to see if all hosts have failed 19285 1727203924.65011: getting the remaining hosts for this loop 19285 1727203924.65013: done getting the remaining hosts for this loop 19285 1727203924.65017: getting the next task for host managed-node2 19285 1727203924.65023: done getting next task for host managed-node2 19285 1727203924.65027: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19285 1727203924.65029: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203924.65042: getting variables 19285 1727203924.65044: in VariableManager get_vars() 19285 1727203924.65090: Calling all_inventory to load vars for managed-node2 19285 1727203924.65093: Calling groups_inventory to load vars for managed-node2 19285 1727203924.65096: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.65107: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.65110: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.65113: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.67495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.69335: done with get_vars() 19285 1727203924.69364: done getting variables 19285 1727203924.69445: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.066) 0:00:23.769 ***** 19285 1727203924.69482: entering _queue_task() for managed-node2/fail 19285 1727203924.69838: worker is 1 (out of 1 available) 19285 1727203924.69967: exiting _queue_task() for managed-node2/fail 19285 1727203924.69981: done queuing things up, now waiting for results queue to drain 19285 1727203924.69983: waiting for pending results... 
19285 1727203924.70173: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19285 1727203924.70273: in run() - task 028d2410-947f-f31b-fb3f-00000000003e 19285 1727203924.70293: variable 'ansible_search_path' from source: unknown 19285 1727203924.70297: variable 'ansible_search_path' from source: unknown 19285 1727203924.70339: calling self._execute() 19285 1727203924.70494: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.70498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.70515: variable 'omit' from source: magic vars 19285 1727203924.70972: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.70989: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203924.71178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203924.74058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203924.74070: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203924.74162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203924.74262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203924.74338: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203924.74448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.74508: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.74550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.74607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.74673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.74903: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.74922: Evaluated conditional (ansible_distribution_major_version | int > 9): True 19285 1727203924.75192: variable 'ansible_distribution' from source: facts 19285 1727203924.75195: variable '__network_rh_distros' from source: role '' defaults 19285 1727203924.75198: Evaluated conditional (ansible_distribution in __network_rh_distros): True 19285 1727203924.75668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.75721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.75880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 
1727203924.75892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.75895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.75897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.75899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.76093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.76101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.76104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.76106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.76109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 19285 1727203924.76111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.76146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.76192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.76746: variable 'network_connections' from source: play vars 19285 1727203924.76756: variable 'profile' from source: play vars 19285 1727203924.76829: variable 'profile' from source: play vars 19285 1727203924.76832: variable 'interface' from source: set_fact 19285 1727203924.76961: variable 'interface' from source: set_fact 19285 1727203924.76974: variable 'network_state' from source: role '' defaults 19285 1727203924.77108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203924.77286: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203924.77346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203924.77394: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203924.77465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203924.77679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203924.77699: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203924.77702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.77705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203924.77708: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 19285 1727203924.77710: when evaluation is False, skipping this task 19285 1727203924.77712: _execute() done 19285 1727203924.77715: dumping result to json 19285 1727203924.77717: done dumping result, returning 19285 1727203924.77719: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-f31b-fb3f-00000000003e] 19285 1727203924.77721: sending task result for task 028d2410-947f-f31b-fb3f-00000000003e skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 19285 1727203924.77868: no more pending results, returning what we have 19285 1727203924.77872: results queue empty 19285 1727203924.77873: checking for 
any_errors_fatal 19285 1727203924.77881: done checking for any_errors_fatal 19285 1727203924.77882: checking for max_fail_percentage 19285 1727203924.77885: done checking for max_fail_percentage 19285 1727203924.77886: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.77886: done checking to see if all hosts have failed 19285 1727203924.77887: getting the remaining hosts for this loop 19285 1727203924.77889: done getting the remaining hosts for this loop 19285 1727203924.77978: getting the next task for host managed-node2 19285 1727203924.77986: done getting next task for host managed-node2 19285 1727203924.77991: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19285 1727203924.77993: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203924.78015: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003e 19285 1727203924.78022: getting variables 19285 1727203924.78024: in VariableManager get_vars() 19285 1727203924.78063: Calling all_inventory to load vars for managed-node2 19285 1727203924.78066: Calling groups_inventory to load vars for managed-node2 19285 1727203924.78069: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.78083: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.78088: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.78092: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.78692: WORKER PROCESS EXITING 19285 1727203924.79633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.82954: done with get_vars() 19285 1727203924.83004: done getting variables 19285 1727203924.83096: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.136) 0:00:23.905 ***** 19285 1727203924.83126: entering _queue_task() for managed-node2/dnf 19285 1727203924.83477: worker is 1 (out of 1 available) 19285 1727203924.83490: exiting _queue_task() for managed-node2/dnf 19285 1727203924.83504: done queuing things up, now waiting for results queue to drain 19285 1727203924.83506: waiting for pending results... 
19285 1727203924.83766: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19285 1727203924.83866: in run() - task 028d2410-947f-f31b-fb3f-00000000003f 19285 1727203924.83870: variable 'ansible_search_path' from source: unknown 19285 1727203924.83874: variable 'ansible_search_path' from source: unknown 19285 1727203924.83973: calling self._execute() 19285 1727203924.83979: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.83982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.83986: variable 'omit' from source: magic vars 19285 1727203924.84364: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.84379: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203924.84564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203924.86731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203924.86803: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203924.86993: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203924.87001: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203924.87004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203924.87007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.87010: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.87033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.87075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.87093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.87199: variable 'ansible_distribution' from source: facts 19285 1727203924.87203: variable 'ansible_distribution_major_version' from source: facts 19285 1727203924.87216: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19285 1727203924.87368: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203924.87485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.87489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.87608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.87682: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.87686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.87769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.87772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203924.87774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.88019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.88022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.88025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203924.88080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 
1727203924.88150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.88199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203924.88266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203924.88440: variable 'network_connections' from source: play vars 19285 1727203924.88467: variable 'profile' from source: play vars 19285 1727203924.88772: variable 'profile' from source: play vars 19285 1727203924.88812: variable 'interface' from source: set_fact 19285 1727203924.88882: variable 'interface' from source: set_fact 19285 1727203924.89429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203924.89755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203924.89806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203924.89845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203924.89890: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203924.89938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203924.89971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203924.90266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203924.90270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203924.90456: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203924.91495: variable 'network_connections' from source: play vars 19285 1727203924.91498: variable 'profile' from source: play vars 19285 1727203924.91501: variable 'profile' from source: play vars 19285 1727203924.91503: variable 'interface' from source: set_fact 19285 1727203924.91505: variable 'interface' from source: set_fact 19285 1727203924.91523: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19285 1727203924.91526: when evaluation is False, skipping this task 19285 1727203924.91529: _execute() done 19285 1727203924.91531: dumping result to json 19285 1727203924.91533: done dumping result, returning 19285 1727203924.91543: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-00000000003f] 19285 1727203924.91549: sending task result for task 028d2410-947f-f31b-fb3f-00000000003f 19285 1727203924.91649: done sending task result for task 028d2410-947f-f31b-fb3f-00000000003f 19285 1727203924.91652: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 19285 1727203924.91710: no more pending results, returning what we have 19285 1727203924.91714: results queue empty 19285 1727203924.91715: checking for any_errors_fatal 19285 1727203924.91722: done checking for any_errors_fatal 19285 1727203924.91722: checking for max_fail_percentage 19285 1727203924.91724: done checking for max_fail_percentage 19285 1727203924.91725: checking to see if all hosts have failed and the running result is not ok 19285 1727203924.91726: done checking to see if all hosts have failed 19285 1727203924.91727: getting the remaining hosts for this loop 19285 1727203924.91728: done getting the remaining hosts for this loop 19285 1727203924.91732: getting the next task for host managed-node2 19285 1727203924.91739: done getting next task for host managed-node2 19285 1727203924.91743: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19285 1727203924.91745: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203924.91759: getting variables 19285 1727203924.91760: in VariableManager get_vars() 19285 1727203924.91803: Calling all_inventory to load vars for managed-node2 19285 1727203924.91806: Calling groups_inventory to load vars for managed-node2 19285 1727203924.91809: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203924.91819: Calling all_plugins_play to load vars for managed-node2 19285 1727203924.91822: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203924.91825: Calling groups_plugins_play to load vars for managed-node2 19285 1727203924.93780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203924.97200: done with get_vars() 19285 1727203924.97231: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19285 1727203924.97321: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:52:04 -0400 (0:00:00.142) 0:00:24.047 ***** 19285 1727203924.97351: entering _queue_task() for managed-node2/yum 19285 1727203924.98132: worker is 1 (out of 1 available) 19285 1727203924.98145: exiting _queue_task() for managed-node2/yum 19285 1727203924.98161: done queuing things up, now waiting for results queue to drain 19285 1727203924.98165: waiting for pending results... 
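[editor's note] The skip recorded above comes from the task's `when:` clause evaluating to False. A minimal pure-Python sketch of that evaluation (a hypothetical helper, not Ansible internals; the variable values are assumed for illustration, though they match the `false_condition` printed in the log):

```python
# Minimal sketch (not Ansible internals) of the `when:` expression logged above:
#   __network_wireless_connections_defined or __network_team_connections_defined

def should_run(wireless_defined: bool, team_defined: bool) -> bool:
    """Return True when the wireless/team package-update check task would run."""
    return wireless_defined or team_defined

# Assumed values: this run defines neither wireless nor team connections,
# so the conditional is False and the task is skipped.
print(should_run(False, False))  # False -> task skipped
```

In the real run, Ansible renders the expression through Jinja2 (3.1.4 here) against the host's variables before deciding to queue or skip the task.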
19285 1727203924.98635: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19285 1727203924.98983: in run() - task 028d2410-947f-f31b-fb3f-000000000040 19285 1727203924.98987: variable 'ansible_search_path' from source: unknown 19285 1727203924.98990: variable 'ansible_search_path' from source: unknown 19285 1727203924.98994: calling self._execute() 19285 1727203924.99354: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203924.99358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203924.99362: variable 'omit' from source: magic vars 19285 1727203924.99990: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.00180: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.00360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203925.02607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203925.02680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203925.02781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203925.02784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203925.02790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203925.02872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.02907: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.02940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.02988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.03007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.03108: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.03127: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19285 1727203925.03134: when evaluation is False, skipping this task 19285 1727203925.03141: _execute() done 19285 1727203925.03160: dumping result to json 19285 1727203925.03163: done dumping result, returning 19285 1727203925.03269: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000040] 19285 1727203925.03273: sending task result for task 028d2410-947f-f31b-fb3f-000000000040 19285 1727203925.03345: done sending task result for task 028d2410-947f-f31b-fb3f-000000000040 19285 1727203925.03348: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19285 1727203925.03401: no more pending results, returning 
what we have 19285 1727203925.03404: results queue empty 19285 1727203925.03405: checking for any_errors_fatal 19285 1727203925.03413: done checking for any_errors_fatal 19285 1727203925.03414: checking for max_fail_percentage 19285 1727203925.03415: done checking for max_fail_percentage 19285 1727203925.03416: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.03417: done checking to see if all hosts have failed 19285 1727203925.03417: getting the remaining hosts for this loop 19285 1727203925.03419: done getting the remaining hosts for this loop 19285 1727203925.03422: getting the next task for host managed-node2 19285 1727203925.03428: done getting next task for host managed-node2 19285 1727203925.03432: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19285 1727203925.03434: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203925.03447: getting variables 19285 1727203925.03448: in VariableManager get_vars() 19285 1727203925.03488: Calling all_inventory to load vars for managed-node2 19285 1727203925.03491: Calling groups_inventory to load vars for managed-node2 19285 1727203925.03493: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.03503: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.03505: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.03508: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.06587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.09956: done with get_vars() 19285 1727203925.10106: done getting variables 19285 1727203925.10169: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.128) 0:00:24.176 ***** 19285 1727203925.10284: entering _queue_task() for managed-node2/fail 19285 1727203925.11011: worker is 1 (out of 1 available) 19285 1727203925.11022: exiting _queue_task() for managed-node2/fail 19285 1727203925.11034: done queuing things up, now waiting for results queue to drain 19285 1727203925.11036: waiting for pending results... 
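[editor's note] The YUM-manager check above is skipped because `ansible_distribution_major_version | int < 8` evaluated to False. A sketch of that comparison (hypothetical function name and fact value, not Ansible internals; the log does not print the actual version):

```python
def yum_check_applies(facts: dict) -> bool:
    """True when `ansible_distribution_major_version | int < 8` holds."""
    # Ansible facts store the major version as a string; the `| int`
    # filter casts it to an integer before the comparison.
    return int(facts["ansible_distribution_major_version"]) < 8

# Assumed fact value for illustration (a recent Fedora major version).
print(yum_check_applies({"ansible_distribution_major_version": "40"}))  # False -> skipped
```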
19285 1727203925.11694: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19285 1727203925.11701: in run() - task 028d2410-947f-f31b-fb3f-000000000041 19285 1727203925.11705: variable 'ansible_search_path' from source: unknown 19285 1727203925.11707: variable 'ansible_search_path' from source: unknown 19285 1727203925.11747: calling self._execute() 19285 1727203925.11843: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.12180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.12185: variable 'omit' from source: magic vars 19285 1727203925.12679: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.12795: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.13019: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.13390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203925.17484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203925.17744: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203925.17792: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203925.17916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203925.18112: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203925.18482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19285 1727203925.18485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.18488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.18512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.18601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.18649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.18881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.18884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.18886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.18888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.18940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.19007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.19280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.19284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.19286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.19546: variable 'network_connections' from source: play vars 19285 1727203925.19881: variable 'profile' from source: play vars 19285 1727203925.19885: variable 'profile' from source: play vars 19285 1727203925.19888: variable 'interface' from source: set_fact 19285 1727203925.19891: variable 'interface' from source: set_fact 19285 1727203925.20098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203925.20484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203925.20525: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203925.20561: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203925.20596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203925.20723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203925.20750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203925.20782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.20882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203925.20937: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203925.21639: variable 'network_connections' from source: play vars 19285 1727203925.21790: variable 'profile' from source: play vars 19285 1727203925.21857: variable 'profile' from source: play vars 19285 1727203925.21867: variable 'interface' from source: set_fact 19285 1727203925.21930: variable 'interface' from source: set_fact 19285 1727203925.22009: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19285 1727203925.22281: when evaluation is False, skipping this task 19285 1727203925.22284: _execute() done 19285 1727203925.22286: dumping result to json 19285 1727203925.22289: done dumping result, returning 19285 1727203925.22292: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000041] 19285 1727203925.22303: sending task result for task 028d2410-947f-f31b-fb3f-000000000041 19285 1727203925.22371: done sending task result for task 028d2410-947f-f31b-fb3f-000000000041 19285 1727203925.22374: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19285 1727203925.22455: no more pending results, returning what we have 19285 1727203925.22458: results queue empty 19285 1727203925.22459: checking for any_errors_fatal 19285 1727203925.22468: done checking for any_errors_fatal 19285 1727203925.22469: checking for max_fail_percentage 19285 1727203925.22470: done checking for max_fail_percentage 19285 1727203925.22471: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.22472: done checking to see if all hosts have failed 19285 1727203925.22472: getting the remaining hosts for this loop 19285 1727203925.22474: done getting the remaining hosts for this loop 19285 1727203925.22481: getting the next task for host managed-node2 19285 1727203925.22488: done getting next task for host managed-node2 19285 1727203925.22491: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19285 1727203925.22493: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203925.22506: getting variables 19285 1727203925.22507: in VariableManager get_vars() 19285 1727203925.22543: Calling all_inventory to load vars for managed-node2 19285 1727203925.22546: Calling groups_inventory to load vars for managed-node2 19285 1727203925.22548: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.22557: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.22559: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.22565: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.24644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.27483: done with get_vars() 19285 1727203925.27509: done getting variables 19285 1727203925.27688: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.175) 0:00:24.351 ***** 19285 1727203925.27720: entering _queue_task() for managed-node2/package 19285 1727203925.28433: worker is 1 (out of 1 available) 19285 1727203925.28446: exiting _queue_task() for managed-node2/package 19285 1727203925.28458: done queuing things up, now waiting for results queue to drain 19285 1727203925.28460: waiting for pending results... 
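[editor's note] Each skipped task above emits a small per-host JSON result. Its shape, reproduced for reference (field values copied from the skip result logged for managed-node2; the serialization step itself is only a sketch):

```python
import json

# Fields copied from the `skipping: [managed-node2]` result in the log above.
skip_result = {
    "changed": False,
    "false_condition": (
        "__network_wireless_connections_defined"
        " or __network_team_connections_defined"
    ),
    "skip_reason": "Conditional result was False",
}
print(json.dumps(skip_result, indent=4, sort_keys=True))
```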
19285 1727203925.28879: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 19285 1727203925.28993: in run() - task 028d2410-947f-f31b-fb3f-000000000042 19285 1727203925.29015: variable 'ansible_search_path' from source: unknown 19285 1727203925.29023: variable 'ansible_search_path' from source: unknown 19285 1727203925.29069: calling self._execute() 19285 1727203925.29171: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.29184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.29206: variable 'omit' from source: magic vars 19285 1727203925.29602: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.29620: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.29822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203925.30092: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203925.30201: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203925.30204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203925.30219: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203925.30332: variable 'network_packages' from source: role '' defaults 19285 1727203925.30441: variable '__network_provider_setup' from source: role '' defaults 19285 1727203925.30455: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203925.30525: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203925.30539: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203925.30980: variable 
'__network_packages_default_nm' from source: role '' defaults 19285 1727203925.31381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203925.43558: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203925.43634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203925.43675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203925.43712: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203925.43739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203925.43809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.43841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.43872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.43919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.43938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 
1727203925.43984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.44012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.44040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.44083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.44101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.44309: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19285 1727203925.44421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.44451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.44480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.44520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.44537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.44623: variable 'ansible_python' from source: facts 19285 1727203925.44650: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19285 1727203925.44732: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203925.44813: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203925.44935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.44963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.44995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.45036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.45055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.45105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.45140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.45166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.45211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.45382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.45386: variable 'network_connections' from source: play vars 19285 1727203925.45388: variable 'profile' from source: play vars 19285 1727203925.45480: variable 'profile' from source: play vars 19285 1727203925.45493: variable 'interface' from source: set_fact 19285 1727203925.45563: variable 'interface' from source: set_fact 19285 1727203925.45630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203925.45662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203925.45701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.45734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203925.45772: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.46048: variable 'network_connections' from source: play vars 19285 1727203925.46059: variable 'profile' from source: play vars 19285 1727203925.46161: variable 'profile' from source: play vars 19285 1727203925.46173: variable 'interface' from source: set_fact 19285 1727203925.46243: variable 'interface' from source: set_fact 19285 1727203925.46285: variable '__network_packages_default_wireless' from source: role '' defaults 19285 1727203925.46362: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.46671: variable 'network_connections' from source: play vars 19285 1727203925.46684: variable 'profile' from source: play vars 19285 1727203925.46746: variable 'profile' from source: play vars 19285 1727203925.46755: variable 'interface' from source: set_fact 19285 1727203925.46851: variable 'interface' from source: set_fact 19285 1727203925.46884: variable '__network_packages_default_team' from source: role '' defaults 19285 1727203925.46952: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203925.47222: variable 'network_connections' from source: play vars 19285 1727203925.47230: variable 'profile' from source: play vars 19285 1727203925.47581: variable 'profile' from source: play vars 19285 1727203925.47584: variable 'interface' from source: set_fact 19285 1727203925.47596: variable 'interface' from source: set_fact 19285 1727203925.47649: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203925.47714: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203925.47980: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203925.47983: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203925.48147: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19285 1727203925.49240: variable 'network_connections' from source: play vars 19285 1727203925.49251: variable 'profile' from source: play vars 19285 1727203925.49315: variable 'profile' from source: play vars 19285 1727203925.49580: variable 'interface' from source: set_fact 19285 1727203925.49583: variable 'interface' from source: set_fact 19285 1727203925.49585: variable 'ansible_distribution' from source: facts 19285 1727203925.49587: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.49589: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.49591: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19285 1727203925.49741: variable 'ansible_distribution' from source: facts 19285 1727203925.49987: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.49998: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.50015: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19285 1727203925.50172: variable 'ansible_distribution' from source: facts 19285 1727203925.50288: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.50298: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.50335: variable 'network_provider' from source: set_fact 19285 1727203925.50681: variable 'ansible_facts' from source: unknown 19285 1727203925.51729: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 19285 
1727203925.51738: when evaluation is False, skipping this task 19285 1727203925.51745: _execute() done 19285 1727203925.51753: dumping result to json 19285 1727203925.51760: done dumping result, returning 19285 1727203925.51772: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-f31b-fb3f-000000000042] 19285 1727203925.51783: sending task result for task 028d2410-947f-f31b-fb3f-000000000042 19285 1727203925.51889: done sending task result for task 028d2410-947f-f31b-fb3f-000000000042 19285 1727203925.51896: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19285 1727203925.51947: no more pending results, returning what we have 19285 1727203925.51950: results queue empty 19285 1727203925.51951: checking for any_errors_fatal 19285 1727203925.52017: done checking for any_errors_fatal 19285 1727203925.52018: checking for max_fail_percentage 19285 1727203925.52020: done checking for max_fail_percentage 19285 1727203925.52021: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.52022: done checking to see if all hosts have failed 19285 1727203925.52022: getting the remaining hosts for this loop 19285 1727203925.52024: done getting the remaining hosts for this loop 19285 1727203925.52028: getting the next task for host managed-node2 19285 1727203925.52033: done getting next task for host managed-node2 19285 1727203925.52037: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19285 1727203925.52038: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 19285 1727203925.52051: getting variables 19285 1727203925.52053: in VariableManager get_vars() 19285 1727203925.52305: Calling all_inventory to load vars for managed-node2 19285 1727203925.52308: Calling groups_inventory to load vars for managed-node2 19285 1727203925.52310: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.52323: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.52326: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.52328: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.59911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.61434: done with get_vars() 19285 1727203925.61465: done getting variables 19285 1727203925.61521: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.338) 0:00:24.689 ***** 19285 1727203925.61554: entering _queue_task() for managed-node2/package 19285 1727203925.61916: worker is 1 (out of 1 available) 19285 1727203925.61928: exiting _queue_task() for managed-node2/package 19285 1727203925.61939: done queuing things up, now waiting for results queue to drain 19285 1727203925.61941: waiting for pending results... 
19285 1727203925.62244: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19285 1727203925.62359: in run() - task 028d2410-947f-f31b-fb3f-000000000043 19285 1727203925.62384: variable 'ansible_search_path' from source: unknown 19285 1727203925.62394: variable 'ansible_search_path' from source: unknown 19285 1727203925.62439: calling self._execute() 19285 1727203925.62548: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.62559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.62578: variable 'omit' from source: magic vars 19285 1727203925.63006: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.63041: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.63237: variable 'network_state' from source: role '' defaults 19285 1727203925.63241: Evaluated conditional (network_state != {}): False 19285 1727203925.63244: when evaluation is False, skipping this task 19285 1727203925.63247: _execute() done 19285 1727203925.63249: dumping result to json 19285 1727203925.63252: done dumping result, returning 19285 1727203925.63254: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-f31b-fb3f-000000000043] 19285 1727203925.63257: sending task result for task 028d2410-947f-f31b-fb3f-000000000043 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203925.63523: no more pending results, returning what we have 19285 1727203925.63527: results queue empty 19285 1727203925.63528: checking for any_errors_fatal 19285 1727203925.63540: done checking for any_errors_fatal 19285 1727203925.63540: checking for max_fail_percentage 19285 
1727203925.63542: done checking for max_fail_percentage 19285 1727203925.63543: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.63544: done checking to see if all hosts have failed 19285 1727203925.63545: getting the remaining hosts for this loop 19285 1727203925.63546: done getting the remaining hosts for this loop 19285 1727203925.63550: getting the next task for host managed-node2 19285 1727203925.63556: done getting next task for host managed-node2 19285 1727203925.63562: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19285 1727203925.63564: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203925.63725: getting variables 19285 1727203925.63727: in VariableManager get_vars() 19285 1727203925.63765: Calling all_inventory to load vars for managed-node2 19285 1727203925.63768: Calling groups_inventory to load vars for managed-node2 19285 1727203925.63770: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.63806: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.63810: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.63983: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.64683: done sending task result for task 028d2410-947f-f31b-fb3f-000000000043 19285 1727203925.64687: WORKER PROCESS EXITING 19285 1727203925.65912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.67507: done with get_vars() 19285 1727203925.67538: done getting variables 19285 1727203925.67635: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.061) 0:00:24.751 ***** 19285 1727203925.67669: entering _queue_task() for managed-node2/package 19285 1727203925.68234: worker is 1 (out of 1 available) 19285 1727203925.68246: exiting _queue_task() for managed-node2/package 19285 1727203925.68263: done queuing things up, now waiting for results queue to drain 19285 1727203925.68264: waiting for pending results... 19285 1727203925.68621: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19285 1727203925.68735: in run() - task 028d2410-947f-f31b-fb3f-000000000044 19285 1727203925.68747: variable 'ansible_search_path' from source: unknown 19285 1727203925.68751: variable 'ansible_search_path' from source: unknown 19285 1727203925.68794: calling self._execute() 19285 1727203925.68907: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.68913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.68928: variable 'omit' from source: magic vars 19285 1727203925.69518: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.69530: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.69667: variable 'network_state' from source: role '' defaults 19285 1727203925.69680: Evaluated conditional (network_state != {}): False 19285 1727203925.69684: when evaluation is False, 
skipping this task 19285 1727203925.69687: _execute() done 19285 1727203925.69698: dumping result to json 19285 1727203925.69701: done dumping result, returning 19285 1727203925.69710: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-f31b-fb3f-000000000044] 19285 1727203925.69716: sending task result for task 028d2410-947f-f31b-fb3f-000000000044 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203925.69939: no more pending results, returning what we have 19285 1727203925.69942: results queue empty 19285 1727203925.69943: checking for any_errors_fatal 19285 1727203925.69948: done checking for any_errors_fatal 19285 1727203925.69949: checking for max_fail_percentage 19285 1727203925.69950: done checking for max_fail_percentage 19285 1727203925.69951: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.69951: done checking to see if all hosts have failed 19285 1727203925.69952: getting the remaining hosts for this loop 19285 1727203925.69954: done getting the remaining hosts for this loop 19285 1727203925.69957: getting the next task for host managed-node2 19285 1727203925.69963: done getting next task for host managed-node2 19285 1727203925.69966: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203925.69968: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203925.69982: getting variables 19285 1727203925.69984: in VariableManager get_vars() 19285 1727203925.70014: Calling all_inventory to load vars for managed-node2 19285 1727203925.70016: Calling groups_inventory to load vars for managed-node2 19285 1727203925.70019: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.70027: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.70029: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.70032: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.70614: done sending task result for task 028d2410-947f-f31b-fb3f-000000000044 19285 1727203925.70618: WORKER PROCESS EXITING 19285 1727203925.71814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.73540: done with get_vars() 19285 1727203925.73565: done getting variables 19285 1727203925.73628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.059) 0:00:24.811 ***** 19285 1727203925.73669: entering _queue_task() for managed-node2/service 19285 1727203925.74059: worker is 1 (out of 1 available) 19285 1727203925.74178: exiting _queue_task() for managed-node2/service 19285 1727203925.74193: done queuing things up, now waiting for results queue to drain 19285 1727203925.74195: waiting for pending results... 
19285 1727203925.74501: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203925.74582: in run() - task 028d2410-947f-f31b-fb3f-000000000045 19285 1727203925.74609: variable 'ansible_search_path' from source: unknown 19285 1727203925.74618: variable 'ansible_search_path' from source: unknown 19285 1727203925.74669: calling self._execute() 19285 1727203925.74796: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.74815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.74838: variable 'omit' from source: magic vars 19285 1727203925.75248: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.75274: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.75426: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.75656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203925.77955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203925.78051: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203925.78099: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203925.78233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203925.78236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203925.78267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 19285 1727203925.78311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.78356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.78422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.78452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.78519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.78547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.78606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.78665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.78699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.78773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.78779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.78809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.78854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.78881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.79115: variable 'network_connections' from source: play vars 19285 1727203925.79119: variable 'profile' from source: play vars 19285 1727203925.79168: variable 'profile' from source: play vars 19285 1727203925.79180: variable 'interface' from source: set_fact 19285 1727203925.79252: variable 'interface' from source: set_fact 19285 1727203925.79332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203925.79507: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203925.79780: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203925.79784: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203925.79786: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203925.79788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203925.79790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203925.79792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.79794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203925.79796: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203925.80119: variable 'network_connections' from source: play vars 19285 1727203925.80133: variable 'profile' from source: play vars 19285 1727203925.80233: variable 'profile' from source: play vars 19285 1727203925.80247: variable 'interface' from source: set_fact 19285 1727203925.80303: variable 'interface' from source: set_fact 19285 1727203925.80336: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19285 1727203925.80339: when evaluation is False, skipping this task 19285 1727203925.80341: _execute() done 19285 1727203925.80344: dumping result to json 19285 1727203925.80346: done dumping result, returning 19285 1727203925.80357: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000045] 19285 1727203925.80370: sending task result for task 028d2410-947f-f31b-fb3f-000000000045 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19285 1727203925.80535: no more pending results, returning what we have 19285 1727203925.80538: results queue empty 19285 1727203925.80539: checking for any_errors_fatal 19285 1727203925.80545: done checking for any_errors_fatal 19285 1727203925.80546: checking for max_fail_percentage 19285 1727203925.80548: done checking for max_fail_percentage 19285 1727203925.80549: checking to see if all hosts have failed and the running result is not ok 19285 1727203925.80549: done checking to see if all hosts have failed 19285 1727203925.80550: getting the remaining hosts for this loop 19285 1727203925.80552: done getting the remaining hosts for this loop 19285 1727203925.80555: getting the next task for host managed-node2 19285 1727203925.80564: done getting next task for host managed-node2 19285 1727203925.80569: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203925.80571: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203925.80589: getting variables 19285 1727203925.80591: in VariableManager get_vars() 19285 1727203925.80633: Calling all_inventory to load vars for managed-node2 19285 1727203925.80637: Calling groups_inventory to load vars for managed-node2 19285 1727203925.80640: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203925.80650: Calling all_plugins_play to load vars for managed-node2 19285 1727203925.80652: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203925.80659: Calling groups_plugins_play to load vars for managed-node2 19285 1727203925.81300: done sending task result for task 028d2410-947f-f31b-fb3f-000000000045 19285 1727203925.81304: WORKER PROCESS EXITING 19285 1727203925.82172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203925.83578: done with get_vars() 19285 1727203925.83593: done getting variables 19285 1727203925.83649: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:52:05 -0400 (0:00:00.100) 0:00:24.911 ***** 19285 1727203925.83675: entering _queue_task() for managed-node2/service 19285 1727203925.83919: worker is 1 (out of 1 available) 19285 1727203925.83933: exiting _queue_task() for managed-node2/service 19285 1727203925.83946: done queuing things up, now waiting for results queue to drain 19285 1727203925.83947: waiting for pending results... 
19285 1727203925.84119: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203925.84183: in run() - task 028d2410-947f-f31b-fb3f-000000000046 19285 1727203925.84194: variable 'ansible_search_path' from source: unknown 19285 1727203925.84198: variable 'ansible_search_path' from source: unknown 19285 1727203925.84309: calling self._execute() 19285 1727203925.84370: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.84374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.84378: variable 'omit' from source: magic vars 19285 1727203925.84739: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.84743: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203925.84891: variable 'network_provider' from source: set_fact 19285 1727203925.84895: variable 'network_state' from source: role '' defaults 19285 1727203925.84898: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19285 1727203925.85034: variable 'omit' from source: magic vars 19285 1727203925.85037: variable 'omit' from source: magic vars 19285 1727203925.85042: variable 'network_service_name' from source: role '' defaults 19285 1727203925.85044: variable 'network_service_name' from source: role '' defaults 19285 1727203925.85296: variable '__network_provider_setup' from source: role '' defaults 19285 1727203925.85299: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203925.85302: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203925.85304: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203925.85307: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203925.85522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 19285 1727203925.87626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203925.87682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203925.87721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203925.87761: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203925.87800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203925.87867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.87894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.87920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.87996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.87999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.88107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19285 1727203925.88110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.88119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.88181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.88185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.88612: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19285 1727203925.88617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.88620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.88623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.88671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.88691: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.88764: variable 'ansible_python' from source: facts 19285 1727203925.88799: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19285 1727203925.88876: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203925.88956: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203925.89080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.89102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.89123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.89163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.89178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.89381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203925.89391: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203925.89393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.89396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203925.89398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203925.89429: variable 'network_connections' from source: play vars 19285 1727203925.89435: variable 'profile' from source: play vars 19285 1727203925.89547: variable 'profile' from source: play vars 19285 1727203925.89551: variable 'interface' from source: set_fact 19285 1727203925.89624: variable 'interface' from source: set_fact 19285 1727203925.89677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203925.89862: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203925.89912: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203925.89951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203925.90032: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203925.90054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203925.90083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203925.90115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203925.90147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203925.90271: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.90471: variable 'network_connections' from source: play vars 19285 1727203925.90478: variable 'profile' from source: play vars 19285 1727203925.90532: variable 'profile' from source: play vars 19285 1727203925.90535: variable 'interface' from source: set_fact 19285 1727203925.90596: variable 'interface' from source: set_fact 19285 1727203925.90621: variable '__network_packages_default_wireless' from source: role '' defaults 19285 1727203925.90678: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203925.90862: variable 'network_connections' from source: play vars 19285 1727203925.90869: variable 'profile' from source: play vars 19285 1727203925.90920: variable 'profile' from source: play vars 19285 1727203925.90926: variable 'interface' from source: set_fact 19285 1727203925.90977: variable 'interface' from source: set_fact 19285 1727203925.90996: variable '__network_packages_default_team' from source: role '' defaults 19285 1727203925.91051: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203925.91234: variable 
'network_connections' from source: play vars 19285 1727203925.91237: variable 'profile' from source: play vars 19285 1727203925.91292: variable 'profile' from source: play vars 19285 1727203925.91296: variable 'interface' from source: set_fact 19285 1727203925.91345: variable 'interface' from source: set_fact 19285 1727203925.91390: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203925.91430: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203925.91436: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203925.91483: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203925.91615: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19285 1727203925.91929: variable 'network_connections' from source: play vars 19285 1727203925.91933: variable 'profile' from source: play vars 19285 1727203925.91979: variable 'profile' from source: play vars 19285 1727203925.91982: variable 'interface' from source: set_fact 19285 1727203925.92035: variable 'interface' from source: set_fact 19285 1727203925.92042: variable 'ansible_distribution' from source: facts 19285 1727203925.92045: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.92050: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.92061: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19285 1727203925.92253: variable 'ansible_distribution' from source: facts 19285 1727203925.92257: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.92260: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.92262: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19285 1727203925.92434: variable 'ansible_distribution' from source: 
facts 19285 1727203925.92438: variable '__network_rh_distros' from source: role '' defaults 19285 1727203925.92444: variable 'ansible_distribution_major_version' from source: facts 19285 1727203925.92447: variable 'network_provider' from source: set_fact 19285 1727203925.92469: variable 'omit' from source: magic vars 19285 1727203925.92591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203925.92788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203925.92792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203925.92794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203925.92796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203925.92798: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203925.92800: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.92802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.92870: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203925.92885: Set connection var ansible_pipelining to False 19285 1727203925.92895: Set connection var ansible_timeout to 10 19285 1727203925.92902: Set connection var ansible_shell_type to sh 19285 1727203925.92919: Set connection var ansible_shell_executable to /bin/sh 19285 1727203925.92928: Set connection var ansible_connection to ssh 19285 1727203925.92954: variable 'ansible_shell_executable' from source: unknown 19285 1727203925.92961: variable 'ansible_connection' from source: unknown 19285 1727203925.92968: variable 'ansible_module_compression' from source: unknown 19285 1727203925.92975: 
variable 'ansible_shell_type' from source: unknown 19285 1727203925.92984: variable 'ansible_shell_executable' from source: unknown 19285 1727203925.93123: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203925.93129: variable 'ansible_pipelining' from source: unknown 19285 1727203925.93131: variable 'ansible_timeout' from source: unknown 19285 1727203925.93133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203925.93231: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203925.93282: variable 'omit' from source: magic vars 19285 1727203925.93285: starting attempt loop 19285 1727203925.93287: running the handler 19285 1727203925.93345: variable 'ansible_facts' from source: unknown 19285 1727203925.93830: _low_level_execute_command(): starting 19285 1727203925.93833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203925.94333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203925.94338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203925.94341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203925.94345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203925.94396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203925.94400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203925.94405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203925.94489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203925.96227: stdout chunk (state=3): >>>/root <<< 19285 1727203925.96358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203925.96372: stderr chunk (state=3): >>><<< 19285 1727203925.96401: stdout chunk (state=3): >>><<< 19285 1727203925.96422: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203925.96441: _low_level_execute_command(): starting 19285 1727203925.96444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139 `" && echo ansible-tmp-1727203925.964207-21315-252567880653139="` echo /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139 `" ) && sleep 0' 19285 1727203925.97302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203925.97419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203925.97424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203925.97469: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19285 1727203925.97564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203925.99516: stdout chunk (state=3): >>>ansible-tmp-1727203925.964207-21315-252567880653139=/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139 <<< 19285 1727203925.99639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203925.99689: stderr chunk (state=3): >>><<< 19285 1727203925.99693: stdout chunk (state=3): >>><<< 19285 1727203925.99716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203925.964207-21315-252567880653139=/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203925.99739: variable 'ansible_module_compression' from source: 
unknown 19285 1727203925.99788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19285 1727203925.99841: variable 'ansible_facts' from source: unknown 19285 1727203925.99973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py 19285 1727203926.00077: Sending initial data 19285 1727203926.00081: Sent initial data (155 bytes) 19285 1727203926.00689: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203926.00693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203926.00699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.00714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.00728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.00825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.02436: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 19285 1727203926.02443: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203926.02507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203926.02585: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpw5h8dl6v /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py <<< 19285 1727203926.02588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py" <<< 19285 1727203926.02652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpw5h8dl6v" to remote "/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py" <<< 19285 1727203926.02656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py" <<< 19285 1727203926.04737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.04740: stderr chunk (state=3): >>><<< 19285 1727203926.04743: stdout chunk (state=3): >>><<< 19285 
1727203926.04796: done transferring module to remote 19285 1727203926.04806: _low_level_execute_command(): starting 19285 1727203926.04816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/ /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py && sleep 0' 19285 1727203926.05787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203926.05791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203926.05913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.05917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.05923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.05956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.06022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.08340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 
1727203926.08344: stdout chunk (state=3): >>><<< 19285 1727203926.08346: stderr chunk (state=3): >>><<< 19285 1727203926.08348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203926.08350: _low_level_execute_command(): starting 19285 1727203926.08352: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/AnsiballZ_systemd.py && sleep 0' 19285 1727203926.08982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203926.08986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203926.08988: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.08990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203926.08992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203926.09001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.09062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.09065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.09138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.09260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.38277: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", 
"MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4493312", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310206976", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "616948000", "TasksCurrent": "4", 
"EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 19285 1727203926.38283: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", 
"LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", 
"SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19285 1727203926.40052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203926.40082: stderr chunk (state=3): >>><<< 19285 1727203926.40085: stdout chunk (state=3): >>><<< 19285 1727203926.40106: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4493312", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310206976", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "616948000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", 
"ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203926.40224: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203926.40240: _low_level_execute_command(): starting 19285 1727203926.40244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203925.964207-21315-252567880653139/ > /dev/null 2>&1 && sleep 0' 19285 1727203926.40698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203926.40702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203926.40704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203926.40706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.40708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203926.40710: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203926.40712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.40757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.40761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.40763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.40844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.42703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.42730: stderr chunk (state=3): >>><<< 19285 1727203926.42732: stdout chunk (state=3): >>><<< 19285 1727203926.42741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203926.42774: handler run complete 19285 1727203926.42796: attempt loop complete, returning result 19285 1727203926.42799: _execute() done 19285 1727203926.42801: dumping result to json 19285 1727203926.42812: done dumping result, returning 19285 1727203926.42821: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-f31b-fb3f-000000000046] 19285 1727203926.42825: sending task result for task 028d2410-947f-f31b-fb3f-000000000046 19285 1727203926.43059: done sending task result for task 028d2410-947f-f31b-fb3f-000000000046 19285 1727203926.43064: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203926.43112: no more pending results, returning what we have 19285 1727203926.43115: results queue empty 19285 1727203926.43116: checking for any_errors_fatal 19285 1727203926.43122: done checking for any_errors_fatal 19285 1727203926.43122: checking for max_fail_percentage 19285 1727203926.43124: done checking for max_fail_percentage 19285 1727203926.43124: checking to see if all hosts have failed and the running result is not ok 19285 1727203926.43125: done checking to see if all hosts have failed 19285 1727203926.43126: getting the remaining hosts for this loop 19285 1727203926.43128: done getting the remaining hosts for this loop 19285 1727203926.43131: getting the next task for host managed-node2 19285 1727203926.43136: done getting next task for host managed-node2 19285 1727203926.43140: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 19285 1727203926.43142: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203926.43150: getting variables 19285 1727203926.43152: in VariableManager get_vars() 19285 1727203926.43185: Calling all_inventory to load vars for managed-node2 19285 1727203926.43187: Calling groups_inventory to load vars for managed-node2 19285 1727203926.43189: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203926.43198: Calling all_plugins_play to load vars for managed-node2 19285 1727203926.43201: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203926.43203: Calling groups_plugins_play to load vars for managed-node2 19285 1727203926.43991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203926.45274: done with get_vars() 19285 1727203926.45299: done getting variables 19285 1727203926.45352: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:52:06 -0400 (0:00:00.617) 0:00:25.528 ***** 19285 1727203926.45378: entering _queue_task() for managed-node2/service 19285 1727203926.45610: worker is 1 (out of 1 available) 19285 1727203926.45624: exiting _queue_task() for managed-node2/service 19285 1727203926.45637: done queuing things 
up, now waiting for results queue to drain 19285 1727203926.45638: waiting for pending results... 19285 1727203926.45821: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19285 1727203926.45893: in run() - task 028d2410-947f-f31b-fb3f-000000000047 19285 1727203926.45912: variable 'ansible_search_path' from source: unknown 19285 1727203926.45916: variable 'ansible_search_path' from source: unknown 19285 1727203926.45939: calling self._execute() 19285 1727203926.46019: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.46024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.46027: variable 'omit' from source: magic vars 19285 1727203926.46301: variable 'ansible_distribution_major_version' from source: facts 19285 1727203926.46309: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203926.46388: variable 'network_provider' from source: set_fact 19285 1727203926.46392: Evaluated conditional (network_provider == "nm"): True 19285 1727203926.46456: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203926.46517: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203926.46634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203926.48651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203926.48680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203926.48698: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203926.48724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203926.48744: 
Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203926.48804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203926.48826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203926.48845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203926.48873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203926.48887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203926.48998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203926.49001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203926.49004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203926.49069: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203926.49100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203926.49190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203926.49193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203926.49195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203926.49217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203926.49254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203926.49428: variable 'network_connections' from source: play vars 19285 1727203926.49431: variable 'profile' from source: play vars 19285 1727203926.49510: variable 'profile' from source: play vars 19285 1727203926.49513: variable 'interface' from source: set_fact 19285 1727203926.49538: variable 'interface' from source: set_fact 19285 1727203926.49747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
19285 1727203926.49886: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203926.49890: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203926.49980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203926.49983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203926.49989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203926.50030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203926.50089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203926.50251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203926.50267: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203926.50459: variable 'network_connections' from source: play vars 19285 1727203926.50483: variable 'profile' from source: play vars 19285 1727203926.50562: variable 'profile' from source: play vars 19285 1727203926.50565: variable 'interface' from source: set_fact 19285 1727203926.50645: variable 'interface' from source: set_fact 19285 1727203926.50648: Evaluated conditional (__network_wpa_supplicant_required): False 19285 1727203926.50651: when evaluation is False, skipping this task 19285 1727203926.50654: 
_execute() done 19285 1727203926.50666: dumping result to json 19285 1727203926.50668: done dumping result, returning 19285 1727203926.50701: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-f31b-fb3f-000000000047] 19285 1727203926.50704: sending task result for task 028d2410-947f-f31b-fb3f-000000000047 19285 1727203926.50855: done sending task result for task 028d2410-947f-f31b-fb3f-000000000047 19285 1727203926.50858: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19285 1727203926.50966: no more pending results, returning what we have 19285 1727203926.50970: results queue empty 19285 1727203926.50971: checking for any_errors_fatal 19285 1727203926.50994: done checking for any_errors_fatal 19285 1727203926.50995: checking for max_fail_percentage 19285 1727203926.50996: done checking for max_fail_percentage 19285 1727203926.50997: checking to see if all hosts have failed and the running result is not ok 19285 1727203926.50998: done checking to see if all hosts have failed 19285 1727203926.50999: getting the remaining hosts for this loop 19285 1727203926.51001: done getting the remaining hosts for this loop 19285 1727203926.51004: getting the next task for host managed-node2 19285 1727203926.51008: done getting next task for host managed-node2 19285 1727203926.51012: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203926.51014: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203926.51027: getting variables 19285 1727203926.51029: in VariableManager get_vars() 19285 1727203926.51064: Calling all_inventory to load vars for managed-node2 19285 1727203926.51067: Calling groups_inventory to load vars for managed-node2 19285 1727203926.51069: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203926.51095: Calling all_plugins_play to load vars for managed-node2 19285 1727203926.51099: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203926.51103: Calling groups_plugins_play to load vars for managed-node2 19285 1727203926.52320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203926.53267: done with get_vars() 19285 1727203926.53289: done getting variables 19285 1727203926.53345: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:52:06 -0400 (0:00:00.079) 0:00:25.608 ***** 19285 1727203926.53365: entering _queue_task() for managed-node2/service 19285 1727203926.53628: worker is 1 (out of 1 available) 19285 1727203926.53647: exiting _queue_task() for managed-node2/service 19285 1727203926.53659: done queuing things up, now waiting for results queue to drain 19285 1727203926.53661: waiting for pending results... 
19285 1727203926.53889: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203926.53958: in run() - task 028d2410-947f-f31b-fb3f-000000000048 19285 1727203926.53971: variable 'ansible_search_path' from source: unknown 19285 1727203926.53976: variable 'ansible_search_path' from source: unknown 19285 1727203926.54004: calling self._execute() 19285 1727203926.54082: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.54086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.54095: variable 'omit' from source: magic vars 19285 1727203926.54357: variable 'ansible_distribution_major_version' from source: facts 19285 1727203926.54369: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203926.54448: variable 'network_provider' from source: set_fact 19285 1727203926.54452: Evaluated conditional (network_provider == "initscripts"): False 19285 1727203926.54455: when evaluation is False, skipping this task 19285 1727203926.54458: _execute() done 19285 1727203926.54460: dumping result to json 19285 1727203926.54465: done dumping result, returning 19285 1727203926.54471: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-f31b-fb3f-000000000048] 19285 1727203926.54479: sending task result for task 028d2410-947f-f31b-fb3f-000000000048 19285 1727203926.54558: done sending task result for task 028d2410-947f-f31b-fb3f-000000000048 19285 1727203926.54561: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203926.54606: no more pending results, returning what we have 19285 1727203926.54610: results queue empty 19285 1727203926.54611: checking for any_errors_fatal 19285 1727203926.54621: done checking for 
any_errors_fatal 19285 1727203926.54622: checking for max_fail_percentage 19285 1727203926.54623: done checking for max_fail_percentage 19285 1727203926.54625: checking to see if all hosts have failed and the running result is not ok 19285 1727203926.54626: done checking to see if all hosts have failed 19285 1727203926.54627: getting the remaining hosts for this loop 19285 1727203926.54629: done getting the remaining hosts for this loop 19285 1727203926.54632: getting the next task for host managed-node2 19285 1727203926.54638: done getting next task for host managed-node2 19285 1727203926.54641: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19285 1727203926.54644: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203926.54657: getting variables 19285 1727203926.54658: in VariableManager get_vars() 19285 1727203926.54696: Calling all_inventory to load vars for managed-node2 19285 1727203926.54699: Calling groups_inventory to load vars for managed-node2 19285 1727203926.54701: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203926.54709: Calling all_plugins_play to load vars for managed-node2 19285 1727203926.54711: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203926.54713: Calling groups_plugins_play to load vars for managed-node2 19285 1727203926.55463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203926.56414: done with get_vars() 19285 1727203926.56428: done getting variables 19285 1727203926.56498: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:52:06 -0400 (0:00:00.031) 0:00:25.639 ***** 19285 1727203926.56535: entering _queue_task() for managed-node2/copy 19285 1727203926.56844: worker is 1 (out of 1 available) 19285 1727203926.56857: exiting _queue_task() for managed-node2/copy 19285 1727203926.56872: done queuing things up, now waiting for results queue to drain 19285 1727203926.56873: waiting for pending results... 
19285 1727203926.57149: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19285 1727203926.57255: in run() - task 028d2410-947f-f31b-fb3f-000000000049 19285 1727203926.57277: variable 'ansible_search_path' from source: unknown 19285 1727203926.57281: variable 'ansible_search_path' from source: unknown 19285 1727203926.57324: calling self._execute() 19285 1727203926.57425: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.57495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.57500: variable 'omit' from source: magic vars 19285 1727203926.57866: variable 'ansible_distribution_major_version' from source: facts 19285 1727203926.58058: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203926.58061: variable 'network_provider' from source: set_fact 19285 1727203926.58064: Evaluated conditional (network_provider == "initscripts"): False 19285 1727203926.58066: when evaluation is False, skipping this task 19285 1727203926.58069: _execute() done 19285 1727203926.58071: dumping result to json 19285 1727203926.58073: done dumping result, returning 19285 1727203926.58078: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-f31b-fb3f-000000000049] 19285 1727203926.58081: sending task result for task 028d2410-947f-f31b-fb3f-000000000049 19285 1727203926.58150: done sending task result for task 028d2410-947f-f31b-fb3f-000000000049 19285 1727203926.58153: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19285 1727203926.58201: no more pending results, returning what we have 19285 1727203926.58205: results queue empty 19285 1727203926.58206: checking for 
any_errors_fatal 19285 1727203926.58216: done checking for any_errors_fatal 19285 1727203926.58217: checking for max_fail_percentage 19285 1727203926.58221: done checking for max_fail_percentage 19285 1727203926.58222: checking to see if all hosts have failed and the running result is not ok 19285 1727203926.58225: done checking to see if all hosts have failed 19285 1727203926.58225: getting the remaining hosts for this loop 19285 1727203926.58227: done getting the remaining hosts for this loop 19285 1727203926.58231: getting the next task for host managed-node2 19285 1727203926.58237: done getting next task for host managed-node2 19285 1727203926.58240: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19285 1727203926.58242: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203926.58257: getting variables 19285 1727203926.58258: in VariableManager get_vars() 19285 1727203926.58377: Calling all_inventory to load vars for managed-node2 19285 1727203926.58379: Calling groups_inventory to load vars for managed-node2 19285 1727203926.58381: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203926.58391: Calling all_plugins_play to load vars for managed-node2 19285 1727203926.58393: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203926.58397: Calling groups_plugins_play to load vars for managed-node2 19285 1727203926.59400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203926.60303: done with get_vars() 19285 1727203926.60318: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:52:06 -0400 (0:00:00.038) 0:00:25.678 ***** 19285 1727203926.60375: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 19285 1727203926.60587: worker is 1 (out of 1 available) 19285 1727203926.60600: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 19285 1727203926.60611: done queuing things up, now waiting for results queue to drain 19285 1727203926.60612: waiting for pending results... 
19285 1727203926.60834: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19285 1727203926.60906: in run() - task 028d2410-947f-f31b-fb3f-00000000004a 19285 1727203926.60910: variable 'ansible_search_path' from source: unknown 19285 1727203926.60913: variable 'ansible_search_path' from source: unknown 19285 1727203926.60942: calling self._execute() 19285 1727203926.61088: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.61095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.61098: variable 'omit' from source: magic vars 19285 1727203926.61370: variable 'ansible_distribution_major_version' from source: facts 19285 1727203926.61382: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203926.61388: variable 'omit' from source: magic vars 19285 1727203926.61421: variable 'omit' from source: magic vars 19285 1727203926.61571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203926.63149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203926.63198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203926.63245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203926.63264: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203926.63288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203926.63349: variable 'network_provider' from source: set_fact 19285 1727203926.63443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203926.63488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203926.63516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203926.63537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203926.63569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203926.63624: variable 'omit' from source: magic vars 19285 1727203926.63718: variable 'omit' from source: magic vars 19285 1727203926.63828: variable 'network_connections' from source: play vars 19285 1727203926.63831: variable 'profile' from source: play vars 19285 1727203926.63893: variable 'profile' from source: play vars 19285 1727203926.63897: variable 'interface' from source: set_fact 19285 1727203926.63964: variable 'interface' from source: set_fact 19285 1727203926.64085: variable 'omit' from source: magic vars 19285 1727203926.64099: variable '__lsr_ansible_managed' from source: task vars 19285 1727203926.64146: variable '__lsr_ansible_managed' from source: task vars 19285 1727203926.64295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19285 1727203926.64824: Loaded config def from plugin (lookup/template) 19285 1727203926.64828: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19285 1727203926.64861: File lookup term: get_ansible_managed.j2 19285 1727203926.64864: variable 'ansible_search_path' from source: unknown 19285 1727203926.64867: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19285 1727203926.64883: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19285 1727203926.64901: variable 'ansible_search_path' from source: unknown 19285 1727203926.69725: variable 'ansible_managed' from source: unknown 19285 1727203926.69805: variable 'omit' from source: magic vars 19285 1727203926.69826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203926.69846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203926.69864: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203926.69879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203926.69888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203926.69912: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203926.69915: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.69917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.69985: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203926.69991: Set connection var ansible_pipelining to False 19285 1727203926.69997: Set connection var ansible_timeout to 10 19285 1727203926.69999: Set connection var ansible_shell_type to sh 19285 1727203926.70005: Set connection var ansible_shell_executable to /bin/sh 19285 1727203926.70009: Set connection var ansible_connection to ssh 19285 1727203926.70027: variable 'ansible_shell_executable' from source: unknown 19285 1727203926.70030: variable 'ansible_connection' from source: unknown 19285 1727203926.70033: variable 'ansible_module_compression' from source: unknown 19285 1727203926.70035: variable 'ansible_shell_type' from source: unknown 19285 1727203926.70037: variable 'ansible_shell_executable' from source: unknown 19285 1727203926.70039: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203926.70043: variable 'ansible_pipelining' from source: unknown 19285 1727203926.70046: variable 'ansible_timeout' from source: unknown 19285 1727203926.70050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203926.70141: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203926.70151: variable 'omit' from source: magic vars 19285 1727203926.70154: starting attempt loop 19285 1727203926.70157: running the handler 19285 1727203926.70171: _low_level_execute_command(): starting 19285 1727203926.70178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203926.70677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203926.70681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.70683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203926.70685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203926.70688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.70737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.70740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.70743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.70824: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.72543: stdout chunk (state=3): >>>/root <<< 19285 1727203926.72698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.72701: stdout chunk (state=3): >>><<< 19285 1727203926.72704: stderr chunk (state=3): >>><<< 19285 1727203926.72809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203926.72817: _low_level_execute_command(): starting 19285 1727203926.72823: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652 `" && echo ansible-tmp-1727203926.7272575-21349-88826463984652="` echo 
/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652 `" ) && sleep 0' 19285 1727203926.73471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.73490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.73514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.73529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.73638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.75615: stdout chunk (state=3): >>>ansible-tmp-1727203926.7272575-21349-88826463984652=/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652 <<< 19285 1727203926.75781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.75793: stdout chunk (state=3): >>><<< 19285 1727203926.75814: stderr chunk (state=3): >>><<< 19285 1727203926.75981: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203926.7272575-21349-88826463984652=/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203926.75984: variable 'ansible_module_compression' from source: unknown 19285 1727203926.75986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19285 1727203926.75988: variable 'ansible_facts' from source: unknown 19285 1727203926.76263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py 19285 1727203926.76398: Sending initial data 19285 1727203926.76499: Sent initial data (167 bytes) 19285 1727203926.77059: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203926.77165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.77189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.77206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.77227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.77327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.78960: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203926.79045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203926.79142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpc8pxq8_4 /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py <<< 19285 1727203926.79153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py" <<< 19285 1727203926.79208: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 19285 1727203926.79230: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpc8pxq8_4" to remote "/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py" <<< 19285 1727203926.80507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.80636: stderr chunk (state=3): >>><<< 19285 1727203926.80639: stdout chunk (state=3): >>><<< 19285 1727203926.80642: done transferring module to remote 19285 1727203926.80644: _low_level_execute_command(): starting 19285 1727203926.80646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/ /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py && sleep 0' 19285 1727203926.81218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.81240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.81258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.81364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203926.83212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203926.83272: stderr chunk (state=3): >>><<< 19285 1727203926.83291: stdout chunk (state=3): >>><<< 19285 1727203926.83315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203926.83325: _low_level_execute_command(): starting 19285 1727203926.83334: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/AnsiballZ_network_connections.py && sleep 0' 19285 1727203926.83983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203926.84057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203926.84109: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203926.84124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203926.84144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203926.84270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.14829: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19285 1727203927.17298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203927.17340: stderr chunk (state=3): >>><<< 19285 1727203927.17343: stdout chunk (state=3): >>><<< 19285 1727203927.17372: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203927.17401: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203927.17412: _low_level_execute_command(): starting 19285 1727203927.17435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203926.7272575-21349-88826463984652/ > /dev/null 2>&1 && sleep 0' 19285 1727203927.17962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203927.17965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203927.17968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.17970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203927.17972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.18030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203927.18036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.18040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.18109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.20001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.20036: stderr chunk (state=3): >>><<< 19285 1727203927.20039: stdout chunk (state=3): >>><<< 19285 1727203927.20052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203927.20062: handler run complete 19285 1727203927.20118: attempt loop complete, returning result 19285 1727203927.20121: _execute() done 19285 1727203927.20124: dumping result to json 19285 1727203927.20125: done dumping result, returning 19285 1727203927.20127: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-f31b-fb3f-00000000004a] 19285 1727203927.20129: sending task result for task 028d2410-947f-f31b-fb3f-00000000004a 19285 1727203927.20218: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004a 19285 1727203927.20221: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19285 1727203927.20311: no more pending results, returning what we have 19285 1727203927.20315: results queue empty 19285 1727203927.20315: checking for any_errors_fatal 19285 1727203927.20321: done checking for any_errors_fatal 19285 1727203927.20322: checking for max_fail_percentage 19285 1727203927.20323: done checking for max_fail_percentage 19285 1727203927.20324: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.20325: done checking to see if all hosts have failed 19285 1727203927.20325: getting the remaining hosts for this loop 19285 1727203927.20327: done getting the remaining hosts for this loop 19285 1727203927.20335: getting the next task for 
host managed-node2 19285 1727203927.20344: done getting next task for host managed-node2 19285 1727203927.20347: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203927.20349: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203927.20361: getting variables 19285 1727203927.20363: in VariableManager get_vars() 19285 1727203927.20404: Calling all_inventory to load vars for managed-node2 19285 1727203927.20407: Calling groups_inventory to load vars for managed-node2 19285 1727203927.20409: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.20417: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.20420: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.20422: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.21550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.22688: done with get_vars() 19285 1727203927.22713: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:52:07 -0400 (0:00:00.623) 0:00:26.302 ***** 19285 1727203927.22774: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203927.23034: worker is 1 (out of 1 available) 19285 1727203927.23053: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203927.23068: done queuing things up, now waiting for results queue to drain 19285 1727203927.23069: waiting for pending results... 
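[Editor's note] For readers following the trace: the `module_args` dumped above for the completed `Configure networking connection profiles` task correspond to role input along these lines. This is a hedged reconstruction from the logged arguments only (`provider: nm`, one connection `LSR-TST-br31` with `state: down`); the play name and file layout are assumptions, not taken from the log.

```yaml
# Hypothetical play reconstructed from the logged module_args.
# Only network_connections and the nm provider appear in the trace;
# the play name and hosts pattern are assumptions for illustration.
- name: Bring the LSR-TST-br31 profile down
  hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: LSR-TST-br31
            state: down
```

Running such a play with `-vvvv` produces the kind of per-task `_low_level_execute_command()` / AnsiballZ transfer trace seen above.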
19285 1727203927.23269: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203927.23357: in run() - task 028d2410-947f-f31b-fb3f-00000000004b 19285 1727203927.23379: variable 'ansible_search_path' from source: unknown 19285 1727203927.23383: variable 'ansible_search_path' from source: unknown 19285 1727203927.23413: calling self._execute() 19285 1727203927.23489: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.23493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.23521: variable 'omit' from source: magic vars 19285 1727203927.23866: variable 'ansible_distribution_major_version' from source: facts 19285 1727203927.23980: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203927.24024: variable 'network_state' from source: role '' defaults 19285 1727203927.24043: Evaluated conditional (network_state != {}): False 19285 1727203927.24047: when evaluation is False, skipping this task 19285 1727203927.24052: _execute() done 19285 1727203927.24089: dumping result to json 19285 1727203927.24092: done dumping result, returning 19285 1727203927.24095: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-f31b-fb3f-00000000004b] 19285 1727203927.24100: sending task result for task 028d2410-947f-f31b-fb3f-00000000004b 19285 1727203927.24244: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004b 19285 1727203927.24247: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203927.24290: no more pending results, returning what we have 19285 1727203927.24294: results queue empty 19285 1727203927.24295: checking for any_errors_fatal 19285 1727203927.24303: done checking for any_errors_fatal 
19285 1727203927.24304: checking for max_fail_percentage 19285 1727203927.24305: done checking for max_fail_percentage 19285 1727203927.24306: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.24307: done checking to see if all hosts have failed 19285 1727203927.24307: getting the remaining hosts for this loop 19285 1727203927.24308: done getting the remaining hosts for this loop 19285 1727203927.24311: getting the next task for host managed-node2 19285 1727203927.24315: done getting next task for host managed-node2 19285 1727203927.24319: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203927.24321: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203927.24333: getting variables 19285 1727203927.24334: in VariableManager get_vars() 19285 1727203927.24362: Calling all_inventory to load vars for managed-node2 19285 1727203927.24365: Calling groups_inventory to load vars for managed-node2 19285 1727203927.24369: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.24382: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.24388: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.24392: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.25260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.26182: done with get_vars() 19285 1727203927.26197: done getting variables 19285 1727203927.26239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:52:07 -0400 (0:00:00.034) 0:00:26.337 ***** 19285 1727203927.26262: entering _queue_task() for managed-node2/debug 19285 1727203927.26483: worker is 1 (out of 1 available) 19285 1727203927.26496: exiting _queue_task() for managed-node2/debug 19285 1727203927.26507: done queuing things up, now waiting for results queue to drain 19285 1727203927.26509: waiting for pending results... 
19285 1727203927.26711: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203927.26819: in run() - task 028d2410-947f-f31b-fb3f-00000000004c 19285 1727203927.26823: variable 'ansible_search_path' from source: unknown 19285 1727203927.26825: variable 'ansible_search_path' from source: unknown 19285 1727203927.26841: calling self._execute() 19285 1727203927.26924: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.26928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.26937: variable 'omit' from source: magic vars 19285 1727203927.27234: variable 'ansible_distribution_major_version' from source: facts 19285 1727203927.27244: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203927.27250: variable 'omit' from source: magic vars 19285 1727203927.27286: variable 'omit' from source: magic vars 19285 1727203927.27309: variable 'omit' from source: magic vars 19285 1727203927.27412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203927.27416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203927.27418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203927.27441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.27459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.27518: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203927.27521: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.27523: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 19285 1727203927.27586: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203927.27609: Set connection var ansible_pipelining to False 19285 1727203927.27613: Set connection var ansible_timeout to 10 19285 1727203927.27615: Set connection var ansible_shell_type to sh 19285 1727203927.27627: Set connection var ansible_shell_executable to /bin/sh 19285 1727203927.27630: Set connection var ansible_connection to ssh 19285 1727203927.27632: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.27635: variable 'ansible_connection' from source: unknown 19285 1727203927.27638: variable 'ansible_module_compression' from source: unknown 19285 1727203927.27640: variable 'ansible_shell_type' from source: unknown 19285 1727203927.27642: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.27644: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.27646: variable 'ansible_pipelining' from source: unknown 19285 1727203927.27648: variable 'ansible_timeout' from source: unknown 19285 1727203927.27771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.27870: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203927.27999: variable 'omit' from source: magic vars 19285 1727203927.28010: starting attempt loop 19285 1727203927.28017: running the handler 19285 1727203927.28163: variable '__network_connections_result' from source: set_fact 19285 1727203927.28231: handler run complete 19285 1727203927.28262: attempt loop complete, returning result 19285 1727203927.28281: _execute() done 19285 1727203927.28289: dumping result to json 19285 1727203927.28298: 
done dumping result, returning 19285 1727203927.28311: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-f31b-fb3f-00000000004c] 19285 1727203927.28323: sending task result for task 028d2410-947f-f31b-fb3f-00000000004c ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 19285 1727203927.28490: no more pending results, returning what we have 19285 1727203927.28496: results queue empty 19285 1727203927.28497: checking for any_errors_fatal 19285 1727203927.28504: done checking for any_errors_fatal 19285 1727203927.28505: checking for max_fail_percentage 19285 1727203927.28507: done checking for max_fail_percentage 19285 1727203927.28508: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.28509: done checking to see if all hosts have failed 19285 1727203927.28509: getting the remaining hosts for this loop 19285 1727203927.28511: done getting the remaining hosts for this loop 19285 1727203927.28514: getting the next task for host managed-node2 19285 1727203927.28522: done getting next task for host managed-node2 19285 1727203927.28526: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203927.28528: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203927.28537: getting variables 19285 1727203927.28538: in VariableManager get_vars() 19285 1727203927.28574: Calling all_inventory to load vars for managed-node2 19285 1727203927.28753: Calling groups_inventory to load vars for managed-node2 19285 1727203927.28756: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.28785: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004c 19285 1727203927.28788: WORKER PROCESS EXITING 19285 1727203927.28797: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.28800: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.28804: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.31357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.33230: done with get_vars() 19285 1727203927.33248: done getting variables 19285 1727203927.33293: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:52:07 -0400 (0:00:00.070) 0:00:26.407 ***** 19285 1727203927.33314: entering _queue_task() for managed-node2/debug 19285 1727203927.33546: worker is 1 (out of 1 available) 19285 1727203927.33561: exiting _queue_task() for managed-node2/debug 19285 1727203927.33574: done queuing things up, now waiting for results queue to drain 19285 1727203927.33577: waiting for pending results... 
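[Editor's note] The two debug tasks traced here (`Show stderr messages for the network_connections` and `Show debug messages for the network_connections`, at `tasks/main.yml:177` and `:181`) print the stderr lines and full result captured from the earlier `network_connections` module run. A task of roughly this shape would produce the `ok: [managed-node2]` output shown; this is a sketch inferred from the logged task names and the `__network_connections_result` variable, not the role file's verbatim contents.

```yaml
# Hedged sketch of the traced debug tasks; exact wording in the
# role's tasks/main.yml may differ.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```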
19285 1727203927.33762: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203927.33884: in run() - task 028d2410-947f-f31b-fb3f-00000000004d 19285 1727203927.33907: variable 'ansible_search_path' from source: unknown 19285 1727203927.33912: variable 'ansible_search_path' from source: unknown 19285 1727203927.33960: calling self._execute() 19285 1727203927.34085: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.34089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.34102: variable 'omit' from source: magic vars 19285 1727203927.34389: variable 'ansible_distribution_major_version' from source: facts 19285 1727203927.34401: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203927.34406: variable 'omit' from source: magic vars 19285 1727203927.34435: variable 'omit' from source: magic vars 19285 1727203927.34459: variable 'omit' from source: magic vars 19285 1727203927.34496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203927.34525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203927.34545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203927.34574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.34616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.34642: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203927.34646: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.34648: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 19285 1727203927.34806: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203927.34813: Set connection var ansible_pipelining to False 19285 1727203927.34817: Set connection var ansible_timeout to 10 19285 1727203927.34819: Set connection var ansible_shell_type to sh 19285 1727203927.34821: Set connection var ansible_shell_executable to /bin/sh 19285 1727203927.34823: Set connection var ansible_connection to ssh 19285 1727203927.34849: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.34866: variable 'ansible_connection' from source: unknown 19285 1727203927.34870: variable 'ansible_module_compression' from source: unknown 19285 1727203927.34900: variable 'ansible_shell_type' from source: unknown 19285 1727203927.34904: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.34907: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.34909: variable 'ansible_pipelining' from source: unknown 19285 1727203927.34911: variable 'ansible_timeout' from source: unknown 19285 1727203927.34915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.35156: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203927.35162: variable 'omit' from source: magic vars 19285 1727203927.35165: starting attempt loop 19285 1727203927.35167: running the handler 19285 1727203927.35169: variable '__network_connections_result' from source: set_fact 19285 1727203927.35269: variable '__network_connections_result' from source: set_fact 19285 1727203927.35453: handler run complete 19285 1727203927.35459: attempt loop complete, returning result 19285 1727203927.35462: 
_execute() done 19285 1727203927.35465: dumping result to json 19285 1727203927.35467: done dumping result, returning 19285 1727203927.35469: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-f31b-fb3f-00000000004d] 19285 1727203927.35471: sending task result for task 028d2410-947f-f31b-fb3f-00000000004d 19285 1727203927.35548: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004d 19285 1727203927.35551: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 19285 1727203927.35630: no more pending results, returning what we have 19285 1727203927.35633: results queue empty 19285 1727203927.35634: checking for any_errors_fatal 19285 1727203927.35644: done checking for any_errors_fatal 19285 1727203927.35645: checking for max_fail_percentage 19285 1727203927.35646: done checking for max_fail_percentage 19285 1727203927.35647: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.35648: done checking to see if all hosts have failed 19285 1727203927.35649: getting the remaining hosts for this loop 19285 1727203927.35650: done getting the remaining hosts for this loop 19285 1727203927.35654: getting the next task for host managed-node2 19285 1727203927.35661: done getting next task for host managed-node2 19285 1727203927.35665: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19285 1727203927.35668: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203927.35684: getting variables 19285 1727203927.35686: in VariableManager get_vars() 19285 1727203927.35719: Calling all_inventory to load vars for managed-node2 19285 1727203927.35722: Calling groups_inventory to load vars for managed-node2 19285 1727203927.35725: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.35734: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.35739: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.35742: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.36795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.38526: done with get_vars() 19285 1727203927.38543: done getting variables 19285 1727203927.38590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:52:07 -0400 (0:00:00.052) 0:00:26.460 ***** 19285 1727203927.38615: entering _queue_task() for managed-node2/debug 19285 1727203927.38838: worker is 1 (out of 1 available) 19285 1727203927.38853: exiting _queue_task() for managed-node2/debug 19285 1727203927.38868: done queuing things up, now waiting for results queue to drain 19285 1727203927.38869: waiting for pending results... 
19285 1727203927.39036: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19285 1727203927.39105: in run() - task 028d2410-947f-f31b-fb3f-00000000004e 19285 1727203927.39117: variable 'ansible_search_path' from source: unknown 19285 1727203927.39120: variable 'ansible_search_path' from source: unknown 19285 1727203927.39148: calling self._execute() 19285 1727203927.39219: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.39223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.39233: variable 'omit' from source: magic vars 19285 1727203927.39495: variable 'ansible_distribution_major_version' from source: facts 19285 1727203927.39505: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203927.39590: variable 'network_state' from source: role '' defaults 19285 1727203927.39599: Evaluated conditional (network_state != {}): False 19285 1727203927.39602: when evaluation is False, skipping this task 19285 1727203927.39605: _execute() done 19285 1727203927.39607: dumping result to json 19285 1727203927.39609: done dumping result, returning 19285 1727203927.39617: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-f31b-fb3f-00000000004e] 19285 1727203927.39622: sending task result for task 028d2410-947f-f31b-fb3f-00000000004e 19285 1727203927.39704: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004e 19285 1727203927.39707: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 19285 1727203927.39790: no more pending results, returning what we have 19285 1727203927.39794: results queue empty 19285 1727203927.39794: checking for any_errors_fatal 19285 1727203927.39800: done checking for any_errors_fatal 19285 1727203927.39801: checking for 
max_fail_percentage 19285 1727203927.39802: done checking for max_fail_percentage 19285 1727203927.39803: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.39804: done checking to see if all hosts have failed 19285 1727203927.39805: getting the remaining hosts for this loop 19285 1727203927.39806: done getting the remaining hosts for this loop 19285 1727203927.39809: getting the next task for host managed-node2 19285 1727203927.39813: done getting next task for host managed-node2 19285 1727203927.39816: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19285 1727203927.39818: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203927.39830: getting variables 19285 1727203927.39831: in VariableManager get_vars() 19285 1727203927.39862: Calling all_inventory to load vars for managed-node2 19285 1727203927.39864: Calling groups_inventory to load vars for managed-node2 19285 1727203927.39867: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.39874: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.39879: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.39881: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.41092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.42582: done with get_vars() 19285 1727203927.42604: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:52:07 -0400 
(0:00:00.040) 0:00:26.501 ***** 19285 1727203927.42695: entering _queue_task() for managed-node2/ping 19285 1727203927.42976: worker is 1 (out of 1 available) 19285 1727203927.42990: exiting _queue_task() for managed-node2/ping 19285 1727203927.43002: done queuing things up, now waiting for results queue to drain 19285 1727203927.43003: waiting for pending results... 19285 1727203927.43294: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 19285 1727203927.43351: in run() - task 028d2410-947f-f31b-fb3f-00000000004f 19285 1727203927.43364: variable 'ansible_search_path' from source: unknown 19285 1727203927.43368: variable 'ansible_search_path' from source: unknown 19285 1727203927.43407: calling self._execute() 19285 1727203927.43498: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.43503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.43513: variable 'omit' from source: magic vars 19285 1727203927.43873: variable 'ansible_distribution_major_version' from source: facts 19285 1727203927.43891: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203927.43894: variable 'omit' from source: magic vars 19285 1727203927.43935: variable 'omit' from source: magic vars 19285 1727203927.43966: variable 'omit' from source: magic vars 19285 1727203927.44108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203927.44112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203927.44115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203927.44117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.44120: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203927.44122: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203927.44124: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.44126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.44231: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203927.44239: Set connection var ansible_pipelining to False 19285 1727203927.44245: Set connection var ansible_timeout to 10 19285 1727203927.44262: Set connection var ansible_shell_type to sh 19285 1727203927.44269: Set connection var ansible_shell_executable to /bin/sh 19285 1727203927.44272: Set connection var ansible_connection to ssh 19285 1727203927.44295: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.44298: variable 'ansible_connection' from source: unknown 19285 1727203927.44301: variable 'ansible_module_compression' from source: unknown 19285 1727203927.44303: variable 'ansible_shell_type' from source: unknown 19285 1727203927.44305: variable 'ansible_shell_executable' from source: unknown 19285 1727203927.44308: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203927.44312: variable 'ansible_pipelining' from source: unknown 19285 1727203927.44314: variable 'ansible_timeout' from source: unknown 19285 1727203927.44323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203927.44517: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203927.44528: variable 'omit' from source: magic vars 19285 1727203927.44534: starting attempt loop 19285 1727203927.44654: running 
the handler 19285 1727203927.44660: _low_level_execute_command(): starting 19285 1727203927.44664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203927.45254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203927.45267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203927.45284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203927.45297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203927.45311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203927.45328: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.45415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.45435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.45546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.47240: stdout chunk (state=3): >>>/root <<< 19285 1727203927.47520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.47523: stdout chunk 
(state=3): >>><<< 19285 1727203927.47525: stderr chunk (state=3): >>><<< 19285 1727203927.47640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203927.47644: _low_level_execute_command(): starting 19285 1727203927.47647: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623 `" && echo ansible-tmp-1727203927.4754949-21378-28844694456623="` echo /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623 `" ) && sleep 0' 19285 1727203927.48903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.48943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203927.48963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.49111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.49270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.51374: stdout chunk (state=3): >>>ansible-tmp-1727203927.4754949-21378-28844694456623=/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623 <<< 19285 1727203927.51582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.51586: stdout chunk (state=3): >>><<< 19285 1727203927.51588: stderr chunk (state=3): >>><<< 19285 1727203927.51590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203927.4754949-21378-28844694456623=/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203927.51593: variable 'ansible_module_compression' from source: unknown 19285 1727203927.51595: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 19285 1727203927.51597: variable 'ansible_facts' from source: unknown 19285 1727203927.51838: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py 19285 1727203927.52220: Sending initial data 19285 1727203927.52223: Sent initial data (152 bytes) 19285 1727203927.53103: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203927.53119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203927.53136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203927.53193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.53259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203927.53286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.53304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.53410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.55040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203927.55129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203927.55226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpu89x0p7i /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py <<< 19285 1727203927.55229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py" <<< 19285 1727203927.55509: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpu89x0p7i" to remote "/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py" <<< 19285 1727203927.56945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.56960: stdout chunk (state=3): >>><<< 19285 1727203927.57018: stderr chunk (state=3): >>><<< 19285 1727203927.57046: done transferring module to remote 19285 1727203927.57069: _low_level_execute_command(): starting 19285 1727203927.57085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/ /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py && sleep 0' 19285 1727203927.57924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203927.57945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203927.58182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.58309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.58383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.60289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.60303: stdout chunk (state=3): >>><<< 19285 1727203927.60313: stderr chunk (state=3): >>><<< 19285 1727203927.60332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203927.60340: _low_level_execute_command(): starting 19285 1727203927.60349: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/AnsiballZ_ping.py && sleep 0' 19285 1727203927.61014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203927.61034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203927.61090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.61164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203927.61182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.61260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.61366: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.76507: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19285 1727203927.77745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.77808: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. <<< 19285 1727203927.77819: stdout chunk (state=3): >>><<< 19285 1727203927.77839: stderr chunk (state=3): >>><<< 19285 1727203927.77972: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203927.77978: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203927.77980: _low_level_execute_command(): starting 19285 1727203927.77983: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203927.4754949-21378-28844694456623/ > /dev/null 2>&1 && sleep 0' 19285 1727203927.78550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203927.78588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.78602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203927.78613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203927.78652: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203927.78719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203927.78737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203927.78795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203927.78881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203927.80792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203927.80796: stdout chunk (state=3): >>><<< 19285 1727203927.80798: stderr chunk (state=3): >>><<< 19285 1727203927.80822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 19285 1727203927.80913: handler run complete 19285 1727203927.80917: attempt loop complete, returning result 19285 1727203927.80919: _execute() done 19285 1727203927.80922: dumping result to json 19285 1727203927.80924: done dumping result, returning 19285 1727203927.80926: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-f31b-fb3f-00000000004f] 19285 1727203927.80928: sending task result for task 028d2410-947f-f31b-fb3f-00000000004f ok: [managed-node2] => { "changed": false, "ping": "pong" } 19285 1727203927.81139: no more pending results, returning what we have 19285 1727203927.81143: results queue empty 19285 1727203927.81144: checking for any_errors_fatal 19285 1727203927.81151: done checking for any_errors_fatal 19285 1727203927.81152: checking for max_fail_percentage 19285 1727203927.81154: done checking for max_fail_percentage 19285 1727203927.81155: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.81156: done checking to see if all hosts have failed 19285 1727203927.81157: getting the remaining hosts for this loop 19285 1727203927.81162: done getting the remaining hosts for this loop 19285 1727203927.81165: getting the next task for host managed-node2 19285 1727203927.81173: done getting next task for host managed-node2 19285 1727203927.81178: ^ task is: TASK: meta (role_complete) 19285 1727203927.81180: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203927.81191: getting variables 19285 1727203927.81193: in VariableManager get_vars() 19285 1727203927.81233: Calling all_inventory to load vars for managed-node2 19285 1727203927.81235: Calling groups_inventory to load vars for managed-node2 19285 1727203927.81239: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.81249: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.81253: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.81256: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.81791: done sending task result for task 028d2410-947f-f31b-fb3f-00000000004f 19285 1727203927.81795: WORKER PROCESS EXITING 19285 1727203927.84395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.86639: done with get_vars() 19285 1727203927.86667: done getting variables 19285 1727203927.86901: done queuing things up, now waiting for results queue to drain 19285 1727203927.86903: results queue empty 19285 1727203927.86904: checking for any_errors_fatal 19285 1727203927.86907: done checking for any_errors_fatal 19285 1727203927.86908: checking for max_fail_percentage 19285 1727203927.86909: done checking for max_fail_percentage 19285 1727203927.86909: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.86910: done checking to see if all hosts have failed 19285 1727203927.86911: getting the remaining hosts for this loop 19285 1727203927.86912: done getting the remaining hosts for this loop 19285 1727203927.86915: getting the next task for host managed-node2 19285 1727203927.86918: done getting next task for host managed-node2 19285 1727203927.86920: ^ task is: TASK: meta (flush_handlers) 19285 1727203927.86921: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203927.86924: getting variables 19285 1727203927.86925: in VariableManager get_vars() 19285 1727203927.86939: Calling all_inventory to load vars for managed-node2 19285 1727203927.86941: Calling groups_inventory to load vars for managed-node2 19285 1727203927.86943: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.86948: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.86951: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.86954: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.88126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.90086: done with get_vars() 19285 1727203927.90135: done getting variables 19285 1727203927.90232: in VariableManager get_vars() 19285 1727203927.90246: Calling all_inventory to load vars for managed-node2 19285 1727203927.90248: Calling groups_inventory to load vars for managed-node2 19285 1727203927.90256: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.90264: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.90266: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.90269: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.91800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.94947: done with get_vars() 19285 1727203927.95266: done queuing things up, now waiting for results queue to drain 19285 1727203927.95269: results queue empty 19285 1727203927.95270: checking for any_errors_fatal 19285 1727203927.95271: done checking for any_errors_fatal 19285 1727203927.95271: checking for 
max_fail_percentage 19285 1727203927.95272: done checking for max_fail_percentage 19285 1727203927.95273: checking to see if all hosts have failed and the running result is not ok 19285 1727203927.95274: done checking to see if all hosts have failed 19285 1727203927.95275: getting the remaining hosts for this loop 19285 1727203927.95277: done getting the remaining hosts for this loop 19285 1727203927.95280: getting the next task for host managed-node2 19285 1727203927.95283: done getting next task for host managed-node2 19285 1727203927.95285: ^ task is: TASK: meta (flush_handlers) 19285 1727203927.95286: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203927.95289: getting variables 19285 1727203927.95290: in VariableManager get_vars() 19285 1727203927.95352: Calling all_inventory to load vars for managed-node2 19285 1727203927.95354: Calling groups_inventory to load vars for managed-node2 19285 1727203927.95356: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203927.95402: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.95420: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.95424: Calling groups_plugins_play to load vars for managed-node2 19285 1727203927.97783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203927.99578: done with get_vars() 19285 1727203927.99603: done getting variables 19285 1727203927.99663: in VariableManager get_vars() 19285 1727203927.99678: Calling all_inventory to load vars for managed-node2 19285 1727203927.99680: Calling groups_inventory to load vars for managed-node2 19285 1727203927.99682: Calling all_plugins_inventory to load vars 
for managed-node2 19285 1727203927.99687: Calling all_plugins_play to load vars for managed-node2 19285 1727203927.99690: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203927.99692: Calling groups_plugins_play to load vars for managed-node2 19285 1727203928.01646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203928.04172: done with get_vars() 19285 1727203928.04206: done queuing things up, now waiting for results queue to drain 19285 1727203928.04208: results queue empty 19285 1727203928.04305: checking for any_errors_fatal 19285 1727203928.04307: done checking for any_errors_fatal 19285 1727203928.04308: checking for max_fail_percentage 19285 1727203928.04310: done checking for max_fail_percentage 19285 1727203928.04310: checking to see if all hosts have failed and the running result is not ok 19285 1727203928.04311: done checking to see if all hosts have failed 19285 1727203928.04312: getting the remaining hosts for this loop 19285 1727203928.04313: done getting the remaining hosts for this loop 19285 1727203928.04316: getting the next task for host managed-node2 19285 1727203928.04351: done getting next task for host managed-node2 19285 1727203928.04352: ^ task is: None 19285 1727203928.04354: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203928.04355: done queuing things up, now waiting for results queue to drain 19285 1727203928.04356: results queue empty 19285 1727203928.04357: checking for any_errors_fatal 19285 1727203928.04360: done checking for any_errors_fatal 19285 1727203928.04361: checking for max_fail_percentage 19285 1727203928.04362: done checking for max_fail_percentage 19285 1727203928.04363: checking to see if all hosts have failed and the running result is not ok 19285 1727203928.04363: done checking to see if all hosts have failed 19285 1727203928.04365: getting the next task for host managed-node2 19285 1727203928.04367: done getting next task for host managed-node2 19285 1727203928.04368: ^ task is: None 19285 1727203928.04369: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203928.04560: in VariableManager get_vars() 19285 1727203928.04580: done with get_vars() 19285 1727203928.04586: in VariableManager get_vars() 19285 1727203928.04596: done with get_vars() 19285 1727203928.04600: variable 'omit' from source: magic vars 19285 1727203928.04630: in VariableManager get_vars() 19285 1727203928.04639: done with get_vars() 19285 1727203928.04714: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 19285 1727203928.04967: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203928.04994: getting the remaining hosts for this loop 19285 1727203928.04999: done getting the remaining hosts for this loop 19285 1727203928.05002: getting the next task for host managed-node2 19285 1727203928.05004: done getting next task for host managed-node2 19285 1727203928.05006: ^ task is: TASK: Gathering Facts 19285 1727203928.05008: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203928.05010: getting variables 19285 1727203928.05011: in VariableManager get_vars() 19285 1727203928.05019: Calling all_inventory to load vars for managed-node2 19285 1727203928.05021: Calling groups_inventory to load vars for managed-node2 19285 1727203928.05023: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203928.05029: Calling all_plugins_play to load vars for managed-node2 19285 1727203928.05031: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203928.05034: Calling groups_plugins_play to load vars for managed-node2 19285 1727203928.06652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203928.17074: done with get_vars() 19285 1727203928.17102: done getting variables 19285 1727203928.17155: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:52:08 -0400 (0:00:00.744) 0:00:27.246 ***** 19285 1727203928.17191: entering _queue_task() for managed-node2/gather_facts 19285 1727203928.17712: worker is 1 (out of 1 available) 19285 1727203928.17730: exiting _queue_task() for managed-node2/gather_facts 19285 1727203928.17770: done queuing things up, now waiting for results queue to drain 19285 1727203928.17773: waiting for pending results... 
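At this point the trace has moved on to the `PLAY [Delete the interface]` play and is queuing its implicit `Gathering Facts` task (task path `tests/network/playbooks/down_profile+delete_interface.yml:5` in the fedora.linux_system_roles collection). As a rough sketch of the play header that would produce this sequence (the body below is an assumption for illustration, not the real contents of that playbook):

```yaml
# Hypothetical reconstruction of the play seen in the trace above.
# gather_facts: true is what schedules the implicit "Gathering Facts" task;
# the subsequent _low_level_execute_command / AnsiballZ_setup.py traffic in the
# log is Ansible shipping and running the setup module for that task.
- name: Delete the interface
  hosts: all
  gather_facts: true
  tasks:
    # Placeholder task; the real play's interface-deletion steps are not
    # shown in this portion of the log.
    - name: Confirm gathered facts are available
      ansible.builtin.debug:
        var: ansible_distribution_major_version
```

The trace's later `Evaluated conditional (ansible_distribution_major_version != '6'): True` lines show these gathered facts being consumed by role conditionals on the same host.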
19285 1727203928.18092: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203928.18098: in run() - task 028d2410-947f-f31b-fb3f-000000000382 19285 1727203928.18101: variable 'ansible_search_path' from source: unknown 19285 1727203928.18140: calling self._execute() 19285 1727203928.18252: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203928.18266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203928.18285: variable 'omit' from source: magic vars 19285 1727203928.18682: variable 'ansible_distribution_major_version' from source: facts 19285 1727203928.18701: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203928.18715: variable 'omit' from source: magic vars 19285 1727203928.18744: variable 'omit' from source: magic vars 19285 1727203928.18787: variable 'omit' from source: magic vars 19285 1727203928.18831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203928.18877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203928.18910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203928.18977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203928.18980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203928.19002: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203928.19014: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203928.19024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203928.19138: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203928.19152: Set connection var ansible_pipelining to False 19285 1727203928.19191: Set connection var ansible_timeout to 10 19285 1727203928.19194: Set connection var ansible_shell_type to sh 19285 1727203928.19198: Set connection var ansible_shell_executable to /bin/sh 19285 1727203928.19200: Set connection var ansible_connection to ssh 19285 1727203928.19218: variable 'ansible_shell_executable' from source: unknown 19285 1727203928.19227: variable 'ansible_connection' from source: unknown 19285 1727203928.19280: variable 'ansible_module_compression' from source: unknown 19285 1727203928.19284: variable 'ansible_shell_type' from source: unknown 19285 1727203928.19287: variable 'ansible_shell_executable' from source: unknown 19285 1727203928.19289: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203928.19290: variable 'ansible_pipelining' from source: unknown 19285 1727203928.19295: variable 'ansible_timeout' from source: unknown 19285 1727203928.19300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203928.19469: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203928.19488: variable 'omit' from source: magic vars 19285 1727203928.19500: starting attempt loop 19285 1727203928.19507: running the handler 19285 1727203928.19533: variable 'ansible_facts' from source: unknown 19285 1727203928.19581: _low_level_execute_command(): starting 19285 1727203928.19585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203928.20410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203928.20431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203928.20449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203928.20473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203928.20605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203928.22368: stdout chunk (state=3): >>>/root <<< 19285 1727203928.22510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203928.22514: stdout chunk (state=3): >>><<< 19285 1727203928.22516: stderr chunk (state=3): >>><<< 19285 1727203928.22538: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203928.22558: _low_level_execute_command(): starting 19285 1727203928.22644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464 `" && echo ansible-tmp-1727203928.2254431-21405-56243810527464="` echo /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464 `" ) && sleep 0' 19285 1727203928.23173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203928.23284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203928.23298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203928.23320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203928.23418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203928.25370: stdout chunk (state=3): >>>ansible-tmp-1727203928.2254431-21405-56243810527464=/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464 <<< 19285 1727203928.25516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203928.25528: stdout chunk (state=3): >>><<< 19285 1727203928.25546: stderr chunk (state=3): >>><<< 19285 1727203928.25570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203928.2254431-21405-56243810527464=/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203928.25612: variable 'ansible_module_compression' from source: unknown 19285 1727203928.25680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203928.25778: variable 'ansible_facts' from source: unknown 19285 1727203928.25969: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py 19285 1727203928.26137: Sending initial data 19285 1727203928.26140: Sent initial data (153 bytes) 19285 1727203928.26866: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203928.26886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 19285 1727203928.26912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203928.26937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203928.27035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203928.28696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203928.28763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203928.28880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpum6u709k /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py <<< 19285 1727203928.28884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py" <<< 19285 1727203928.28963: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpum6u709k" to remote "/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py" <<< 19285 1727203928.30251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203928.30427: stderr chunk (state=3): >>><<< 19285 1727203928.30431: stdout chunk (state=3): >>><<< 19285 1727203928.30434: done transferring module to remote 19285 1727203928.30436: _low_level_execute_command(): starting 19285 1727203928.30439: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/ /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py && sleep 0' 19285 1727203928.31094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203928.31113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203928.31131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203928.31157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203928.31253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203928.33057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203928.33082: stderr chunk (state=3): >>><<< 19285 1727203928.33085: stdout chunk (state=3): >>><<< 19285 1727203928.33098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203928.33108: _low_level_execute_command(): starting 19285 1727203928.33115: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/AnsiballZ_setup.py && sleep 0' 19285 1727203928.33696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203928.33712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203928.33735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203928.33752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203928.33768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203928.33847: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203928.33886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203928.33900: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 19285 1727203928.34011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203928.34104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203928.97689: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "08", "epoch": "1727203928", "epoch_int": "1727203928", "date": "2024-09-24", "time": "14:52:08", "iso8601_micro": "2024-09-24T18:52:08.611605Z", "iso8601": "2024-09-24T18:52:08Z", "iso8601_basic": "20240924T145208611605", "iso8601_basic_short": "20240924T145208", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.6142578125, "5m": 0.42041015625, "15m": 0.20849609375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", 
"ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 514, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787811840, "block_size": 4096, "block_total": 65519099, "block_available": 63913040, "block_used": 1606059, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", 
"ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203928.99188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203928.99201: stdout chunk (state=3): >>><<< 19285 1727203928.99403: stderr chunk (state=3): >>><<< 19285 1727203928.99582: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "08", "epoch": "1727203928", "epoch_int": "1727203928", "date": "2024-09-24", "time": "14:52:08", "iso8601_micro": "2024-09-24T18:52:08.611605Z", "iso8601": "2024-09-24T18:52:08Z", "iso8601_basic": "20240924T145208611605", "iso8601_basic_short": "20240924T145208", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.6142578125, "5m": 0.42041015625, "15m": 0.20849609375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, 
"nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 514, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268366229504, "size_available": 261787811840, "block_size": 4096, "block_total": 65519099, "block_available": 63913040, "block_used": 1606059, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": 
["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203929.00299: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203929.00310: _low_level_execute_command(): starting 19285 1727203929.00314: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203928.2254431-21405-56243810527464/ > /dev/null 2>&1 && sleep 0' 19285 1727203929.01348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203929.01698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203929.01717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.01752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.01856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.03736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.03870: stderr chunk (state=3): >>><<< 19285 1727203929.03883: stdout chunk (state=3): >>><<< 19285 1727203929.03905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203929.03919: handler run complete 19285 1727203929.04130: variable 'ansible_facts' from source: unknown 19285 1727203929.04433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.05031: variable 'ansible_facts' from source: unknown 19285 1727203929.05182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.05493: attempt loop complete, returning result 19285 1727203929.05543: _execute() done 19285 1727203929.05551: dumping result to json 19285 1727203929.05674: done dumping result, returning 19285 1727203929.05699: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-000000000382] 19285 1727203929.05714: sending task result for task 028d2410-947f-f31b-fb3f-000000000382 19285 1727203929.06523: done sending task result for task 028d2410-947f-f31b-fb3f-000000000382 19285 1727203929.06527: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203929.07107: no more pending results, returning what we have 19285 1727203929.07111: results queue empty 19285 1727203929.07112: checking for any_errors_fatal 19285 1727203929.07113: done checking for any_errors_fatal 19285 1727203929.07113: checking for max_fail_percentage 19285 1727203929.07115: done checking for max_fail_percentage 19285 1727203929.07116: checking to see if all hosts have failed and the running result is not ok 19285 1727203929.07117: done checking to see if all hosts have failed 19285 1727203929.07118: getting the remaining hosts for this loop 19285 1727203929.07120: done getting the remaining hosts for this loop 19285 1727203929.07124: getting the next task for host managed-node2 19285 
1727203929.07129: done getting next task for host managed-node2 19285 1727203929.07131: ^ task is: TASK: meta (flush_handlers) 19285 1727203929.07133: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203929.07137: getting variables 19285 1727203929.07138: in VariableManager get_vars() 19285 1727203929.07162: Calling all_inventory to load vars for managed-node2 19285 1727203929.07165: Calling groups_inventory to load vars for managed-node2 19285 1727203929.07169: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203929.07465: Calling all_plugins_play to load vars for managed-node2 19285 1727203929.07469: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203929.07473: Calling groups_plugins_play to load vars for managed-node2 19285 1727203929.10226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.14465: done with get_vars() 19285 1727203929.14735: done getting variables 19285 1727203929.15046: in VariableManager get_vars() 19285 1727203929.15059: Calling all_inventory to load vars for managed-node2 19285 1727203929.15061: Calling groups_inventory to load vars for managed-node2 19285 1727203929.15063: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203929.15068: Calling all_plugins_play to load vars for managed-node2 19285 1727203929.15070: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203929.15072: Calling groups_plugins_play to load vars for managed-node2 19285 1727203929.18320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.21363: done with get_vars() 19285 
1727203929.21508: done queuing things up, now waiting for results queue to drain
19285 1727203929.21511: results queue empty
19285 1727203929.21511: checking for any_errors_fatal
19285 1727203929.21516: done checking for any_errors_fatal
19285 1727203929.21517: checking for max_fail_percentage
19285 1727203929.21518: done checking for max_fail_percentage
19285 1727203929.21518: checking to see if all hosts have failed and the running result is not ok
19285 1727203929.21519: done checking to see if all hosts have failed
19285 1727203929.21524: getting the remaining hosts for this loop
19285 1727203929.21525: done getting the remaining hosts for this loop
19285 1727203929.21528: getting the next task for host managed-node2
19285 1727203929.21532: done getting next task for host managed-node2
19285 1727203929.21535: ^ task is: TASK: Include the task 'delete_interface.yml'
19285 1727203929.21537: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203929.21539: getting variables
19285 1727203929.21540: in VariableManager get_vars()
19285 1727203929.21550: Calling all_inventory to load vars for managed-node2
19285 1727203929.21553: Calling groups_inventory to load vars for managed-node2
19285 1727203929.21555: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203929.21564: Calling all_plugins_play to load vars for managed-node2
19285 1727203929.21567: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203929.21570: Calling groups_plugins_play to load vars for managed-node2
19285 1727203929.24194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203929.27554: done with get_vars()
19285 1727203929.27577: done getting variables

TASK [Include the task 'delete_interface.yml'] *********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8
Tuesday 24 September 2024 14:52:09 -0400 (0:00:01.105) 0:00:28.352 *****
19285 1727203929.27771: entering _queue_task() for managed-node2/include_tasks
19285 1727203929.28758: worker is 1 (out of 1 available)
19285 1727203929.28770: exiting _queue_task() for managed-node2/include_tasks
19285 1727203929.28910: done queuing things up, now waiting for results queue to drain
19285 1727203929.28913: waiting for pending results...
19285 1727203929.29794: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 19285 1727203929.29799: in run() - task 028d2410-947f-f31b-fb3f-000000000052 19285 1727203929.29802: variable 'ansible_search_path' from source: unknown 19285 1727203929.29806: calling self._execute() 19285 1727203929.30182: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203929.30186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203929.30189: variable 'omit' from source: magic vars 19285 1727203929.30827: variable 'ansible_distribution_major_version' from source: facts 19285 1727203929.30839: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203929.30846: _execute() done 19285 1727203929.30849: dumping result to json 19285 1727203929.30852: done dumping result, returning 19285 1727203929.30973: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [028d2410-947f-f31b-fb3f-000000000052] 19285 1727203929.31181: sending task result for task 028d2410-947f-f31b-fb3f-000000000052 19285 1727203929.31262: done sending task result for task 028d2410-947f-f31b-fb3f-000000000052 19285 1727203929.31266: WORKER PROCESS EXITING 19285 1727203929.31297: no more pending results, returning what we have 19285 1727203929.31303: in VariableManager get_vars() 19285 1727203929.31338: Calling all_inventory to load vars for managed-node2 19285 1727203929.31341: Calling groups_inventory to load vars for managed-node2 19285 1727203929.31345: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203929.31360: Calling all_plugins_play to load vars for managed-node2 19285 1727203929.31363: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203929.31367: Calling groups_plugins_play to load vars for managed-node2 19285 1727203929.35708: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.39737: done with get_vars() 19285 1727203929.39766: variable 'ansible_search_path' from source: unknown 19285 1727203929.39887: we have included files to process 19285 1727203929.39889: generating all_blocks data 19285 1727203929.39890: done generating all_blocks data 19285 1727203929.39891: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19285 1727203929.39892: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19285 1727203929.39895: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19285 1727203929.40329: done processing included file 19285 1727203929.40331: iterating over new_blocks loaded from include file 19285 1727203929.40333: in VariableManager get_vars() 19285 1727203929.40463: done with get_vars() 19285 1727203929.40465: filtering new block on tags 19285 1727203929.40485: done filtering new block on tags 19285 1727203929.40488: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 19285 1727203929.40493: extending task lists for all hosts with included blocks 19285 1727203929.40525: done extending task lists 19285 1727203929.40526: done processing included files 19285 1727203929.40527: results queue empty 19285 1727203929.40527: checking for any_errors_fatal 19285 1727203929.40529: done checking for any_errors_fatal 19285 1727203929.40530: checking for max_fail_percentage 19285 1727203929.40531: done checking for max_fail_percentage 19285 1727203929.40531: checking to see if all hosts have failed and the running result 
is not ok
19285 1727203929.40532: done checking to see if all hosts have failed
19285 1727203929.40533: getting the remaining hosts for this loop
19285 1727203929.40534: done getting the remaining hosts for this loop
19285 1727203929.40536: getting the next task for host managed-node2
19285 1727203929.40540: done getting next task for host managed-node2
19285 1727203929.40543: ^ task is: TASK: Remove test interface if necessary
19285 1727203929.40545: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203929.40547: getting variables
19285 1727203929.40548: in VariableManager get_vars()
19285 1727203929.40560: Calling all_inventory to load vars for managed-node2
19285 1727203929.40680: Calling groups_inventory to load vars for managed-node2
19285 1727203929.40685: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203929.40694: Calling all_plugins_play to load vars for managed-node2
19285 1727203929.40697: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203929.40700: Calling groups_plugins_play to load vars for managed-node2
19285 1727203929.43368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203929.46646: done with get_vars()
19285 1727203929.46679: done getting variables
19285 1727203929.46845: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interface if necessary] **************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
Tuesday 24 September 2024 14:52:09 -0400 (0:00:00.191) 0:00:28.543 *****
19285 1727203929.46945: entering _queue_task() for managed-node2/command
19285 1727203929.47843: worker is 1 (out of 1 available)
19285 1727203929.47856: exiting _queue_task() for managed-node2/command
19285 1727203929.47869: done queuing things up, now waiting for results queue to drain
19285 1727203929.47871: waiting for pending results...
19285 1727203929.48248: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 19285 1727203929.48693: in run() - task 028d2410-947f-f31b-fb3f-000000000393 19285 1727203929.48698: variable 'ansible_search_path' from source: unknown 19285 1727203929.48700: variable 'ansible_search_path' from source: unknown 19285 1727203929.48703: calling self._execute() 19285 1727203929.48856: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203929.48872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203929.48890: variable 'omit' from source: magic vars 19285 1727203929.49662: variable 'ansible_distribution_major_version' from source: facts 19285 1727203929.49722: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203929.49735: variable 'omit' from source: magic vars 19285 1727203929.49882: variable 'omit' from source: magic vars 19285 1727203929.49993: variable 'interface' from source: set_fact 19285 1727203929.50181: variable 'omit' from source: magic vars 19285 1727203929.50216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203929.50268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203929.50472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203929.50478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203929.50481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203929.50483: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203929.50485: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203929.50488: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203929.50678: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203929.50880: Set connection var ansible_pipelining to False 19285 1727203929.50884: Set connection var ansible_timeout to 10 19285 1727203929.50886: Set connection var ansible_shell_type to sh 19285 1727203929.50888: Set connection var ansible_shell_executable to /bin/sh 19285 1727203929.50891: Set connection var ansible_connection to ssh 19285 1727203929.50893: variable 'ansible_shell_executable' from source: unknown 19285 1727203929.50896: variable 'ansible_connection' from source: unknown 19285 1727203929.50898: variable 'ansible_module_compression' from source: unknown 19285 1727203929.50902: variable 'ansible_shell_type' from source: unknown 19285 1727203929.50904: variable 'ansible_shell_executable' from source: unknown 19285 1727203929.50906: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203929.50908: variable 'ansible_pipelining' from source: unknown 19285 1727203929.50910: variable 'ansible_timeout' from source: unknown 19285 1727203929.50912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203929.51356: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203929.51362: variable 'omit' from source: magic vars 19285 1727203929.51365: starting attempt loop 19285 1727203929.51368: running the handler 19285 1727203929.51370: _low_level_execute_command(): starting 19285 1727203929.51372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203929.52822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203929.52844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203929.52992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203929.53012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.53161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.53279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.54982: stdout chunk (state=3): >>>/root <<< 19285 1727203929.55181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.55185: stdout chunk (state=3): >>><<< 19285 1727203929.55187: stderr chunk (state=3): >>><<< 19285 1727203929.55217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203929.55237: _low_level_execute_command(): starting 19285 1727203929.55247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490 `" && echo ansible-tmp-1727203929.5522387-21580-196662713024490="` echo /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490 `" ) && sleep 0' 19285 1727203929.56518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203929.56533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203929.56756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203929.56769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203929.56789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.56810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.56916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.58911: stdout chunk (state=3): >>>ansible-tmp-1727203929.5522387-21580-196662713024490=/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490 <<< 19285 1727203929.59216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.59219: stdout chunk (state=3): >>><<< 19285 1727203929.59222: stderr chunk (state=3): >>><<< 19285 1727203929.59224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203929.5522387-21580-196662713024490=/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203929.59226: variable 'ansible_module_compression' from source: unknown 19285 1727203929.59283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19285 1727203929.59368: variable 'ansible_facts' from source: unknown 19285 1727203929.59682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py 19285 1727203929.59845: Sending initial data 19285 1727203929.59856: Sent initial data (156 bytes) 19285 1727203929.61220: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203929.61309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.61461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.61528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.63146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203929.63298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203929.63382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpphbyj7eg /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py <<< 19285 1727203929.63389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py" <<< 19285 1727203929.63448: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpphbyj7eg" to remote "/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py" <<< 19285 1727203929.65404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.65548: stderr chunk (state=3): >>><<< 19285 1727203929.65552: stdout chunk (state=3): >>><<< 19285 1727203929.65554: done transferring module to remote 19285 1727203929.65741: _low_level_execute_command(): starting 19285 1727203929.65745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/ /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py && sleep 0' 19285 1727203929.67023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203929.67026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203929.67092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203929.67111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.67129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.67236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.69094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.69143: stderr chunk (state=3): >>><<< 19285 1727203929.69281: stdout chunk (state=3): >>><<< 19285 1727203929.69289: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203929.69294: _low_level_execute_command(): starting 19285 1727203929.69298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/AnsiballZ_command.py && sleep 0' 19285 1727203929.70582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203929.70585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203929.70588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203929.70590: stderr chunk (state=3): >>>debug2: match found <<< 19285 1727203929.70592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203929.70740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.70744: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.70837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.86920: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:52:09.860703", "end": "2024-09-24 14:52:09.868165", "delta": "0:00:00.007462", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203929.88483: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.254 closed. <<< 19285 1727203929.88494: stdout chunk (state=3): >>><<< 19285 1727203929.88497: stderr chunk (state=3): >>><<< 19285 1727203929.88499: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:52:09.860703", "end": "2024-09-24 14:52:09.868165", "delta": "0:00:00.007462", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.13.254 closed. 19285 1727203929.88699: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203929.88703: _low_level_execute_command(): starting 19285 1727203929.88706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203929.5522387-21580-196662713024490/ > /dev/null 2>&1 && sleep 0' 19285 
1727203929.89794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203929.89802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203929.89813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203929.89828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203929.89879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203929.89968: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203929.90222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203929.90319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203929.92446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203929.92450: stdout chunk (state=3): >>><<< 19285 1727203929.92456: stderr chunk (state=3): >>><<< 19285 1727203929.92479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203929.92485: handler run complete 19285 1727203929.92511: Evaluated conditional (False): False 19285 1727203929.92520: attempt loop complete, returning result 19285 1727203929.92523: _execute() done 19285 1727203929.92526: dumping result to json 19285 1727203929.92581: done dumping result, returning 19285 1727203929.92584: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [028d2410-947f-f31b-fb3f-000000000393] 19285 1727203929.92587: sending task result for task 028d2410-947f-f31b-fb3f-000000000393 19285 1727203929.92655: done sending task result for task 028d2410-947f-f31b-fb3f-000000000393 19285 1727203929.92658: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007462", "end": "2024-09-24 14:52:09.868165", "rc": 1, "start": "2024-09-24 14:52:09.860703" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 19285 1727203929.92756: no more pending results, returning what we have 19285 1727203929.92759: results queue empty 19285 1727203929.92760: checking for any_errors_fatal 19285 1727203929.92762: done checking for any_errors_fatal 19285 1727203929.92763: checking for max_fail_percentage 19285 1727203929.92765: done checking for max_fail_percentage 19285 1727203929.92766: checking to see if all hosts have failed and the running result is not ok 19285 1727203929.92767: done checking to see if all hosts have failed 19285 1727203929.92768: getting the remaining hosts for this loop 19285 1727203929.92769: done getting the remaining hosts for this loop 19285 1727203929.92772: getting the next task for host managed-node2 19285 1727203929.92887: done getting next task for host managed-node2 19285 1727203929.92890: ^ task is: TASK: meta (flush_handlers) 19285 1727203929.92892: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203929.92897: getting variables 19285 1727203929.92898: in VariableManager get_vars() 19285 1727203929.92925: Calling all_inventory to load vars for managed-node2 19285 1727203929.92927: Calling groups_inventory to load vars for managed-node2 19285 1727203929.92930: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203929.92941: Calling all_plugins_play to load vars for managed-node2 19285 1727203929.92943: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203929.92945: Calling groups_plugins_play to load vars for managed-node2 19285 1727203929.95968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203929.99703: done with get_vars() 19285 1727203929.99855: done getting variables 19285 1727203929.99929: in VariableManager get_vars() 19285 1727203929.99939: Calling all_inventory to load vars for managed-node2 19285 1727203929.99942: Calling groups_inventory to load vars for managed-node2 19285 1727203929.99944: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203930.00078: Calling all_plugins_play to load vars for managed-node2 19285 1727203930.00081: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203930.00085: Calling groups_plugins_play to load vars for managed-node2 19285 1727203930.03351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203930.07725: done with get_vars() 19285 1727203930.07885: done queuing things up, now waiting for results queue to drain 19285 1727203930.07887: results queue empty 19285 1727203930.07888: checking for any_errors_fatal 19285 1727203930.07892: done checking for any_errors_fatal 19285 1727203930.07893: checking for max_fail_percentage 19285 1727203930.07894: done checking for max_fail_percentage 19285 1727203930.07894: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203930.07895: done checking to see if all hosts have failed 19285 1727203930.07896: getting the remaining hosts for this loop 19285 1727203930.07897: done getting the remaining hosts for this loop 19285 1727203930.07899: getting the next task for host managed-node2 19285 1727203930.07903: done getting next task for host managed-node2 19285 1727203930.07904: ^ task is: TASK: meta (flush_handlers) 19285 1727203930.07906: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203930.07909: getting variables 19285 1727203930.07910: in VariableManager get_vars() 19285 1727203930.07919: Calling all_inventory to load vars for managed-node2 19285 1727203930.07921: Calling groups_inventory to load vars for managed-node2 19285 1727203930.07924: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203930.07929: Calling all_plugins_play to load vars for managed-node2 19285 1727203930.07932: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203930.07934: Calling groups_plugins_play to load vars for managed-node2 19285 1727203930.10325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203930.14319: done with get_vars() 19285 1727203930.14348: done getting variables 19285 1727203930.14523: in VariableManager get_vars() 19285 1727203930.14534: Calling all_inventory to load vars for managed-node2 19285 1727203930.14536: Calling groups_inventory to load vars for managed-node2 19285 1727203930.14539: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203930.14544: Calling all_plugins_play to load vars for managed-node2 19285 1727203930.14546: Calling groups_plugins_inventory to load vars for 
managed-node2 19285 1727203930.14549: Calling groups_plugins_play to load vars for managed-node2 19285 1727203930.17349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203930.21314: done with get_vars() 19285 1727203930.21397: done queuing things up, now waiting for results queue to drain 19285 1727203930.21399: results queue empty 19285 1727203930.21400: checking for any_errors_fatal 19285 1727203930.21401: done checking for any_errors_fatal 19285 1727203930.21402: checking for max_fail_percentage 19285 1727203930.21403: done checking for max_fail_percentage 19285 1727203930.21413: checking to see if all hosts have failed and the running result is not ok 19285 1727203930.21415: done checking to see if all hosts have failed 19285 1727203930.21416: getting the remaining hosts for this loop 19285 1727203930.21417: done getting the remaining hosts for this loop 19285 1727203930.21420: getting the next task for host managed-node2 19285 1727203930.21424: done getting next task for host managed-node2 19285 1727203930.21425: ^ task is: None 19285 1727203930.21426: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203930.21428: done queuing things up, now waiting for results queue to drain 19285 1727203930.21429: results queue empty 19285 1727203930.21429: checking for any_errors_fatal 19285 1727203930.21430: done checking for any_errors_fatal 19285 1727203930.21431: checking for max_fail_percentage 19285 1727203930.21432: done checking for max_fail_percentage 19285 1727203930.21433: checking to see if all hosts have failed and the running result is not ok 19285 1727203930.21433: done checking to see if all hosts have failed 19285 1727203930.21434: getting the next task for host managed-node2 19285 1727203930.21437: done getting next task for host managed-node2 19285 1727203930.21438: ^ task is: None 19285 1727203930.21439: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203930.21714: in VariableManager get_vars() 19285 1727203930.21738: done with get_vars() 19285 1727203930.21745: in VariableManager get_vars() 19285 1727203930.21758: done with get_vars() 19285 1727203930.21763: variable 'omit' from source: magic vars 19285 1727203930.22084: variable 'profile' from source: play vars 19285 1727203930.22280: in VariableManager get_vars() 19285 1727203930.22295: done with get_vars() 19285 1727203930.22316: variable 'omit' from source: magic vars 19285 1727203930.22387: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 19285 1727203930.23799: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203930.23865: getting the remaining hosts for this loop 19285 1727203930.23867: done getting the remaining hosts for this loop 19285 1727203930.23870: getting the next task for host managed-node2 19285 1727203930.23873: done getting next task for host managed-node2 19285 1727203930.23880: ^ task is: TASK: Gathering Facts 19285 1727203930.23881: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203930.23883: getting variables 19285 1727203930.23886: in VariableManager get_vars() 19285 1727203930.23922: Calling all_inventory to load vars for managed-node2 19285 1727203930.23925: Calling groups_inventory to load vars for managed-node2 19285 1727203930.23927: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203930.23933: Calling all_plugins_play to load vars for managed-node2 19285 1727203930.23935: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203930.23938: Calling groups_plugins_play to load vars for managed-node2 19285 1727203930.26301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203930.28295: done with get_vars() 19285 1727203930.28322: done getting variables 19285 1727203930.28374: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:52:10 -0400 (0:00:00.814) 0:00:29.358 ***** 19285 1727203930.28437: entering _queue_task() for managed-node2/gather_facts 19285 1727203930.29164: worker is 1 (out of 1 available) 19285 1727203930.29177: exiting _queue_task() for managed-node2/gather_facts 19285 1727203930.29188: done queuing things up, now waiting for results queue to drain 19285 1727203930.29190: waiting for pending results... 
19285 1727203930.29461: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203930.29842: in run() - task 028d2410-947f-f31b-fb3f-0000000003a1 19285 1727203930.29848: variable 'ansible_search_path' from source: unknown 19285 1727203930.29853: calling self._execute() 19285 1727203930.30096: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203930.30107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203930.30121: variable 'omit' from source: magic vars 19285 1727203930.30572: variable 'ansible_distribution_major_version' from source: facts 19285 1727203930.30628: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203930.30680: variable 'omit' from source: magic vars 19285 1727203930.30683: variable 'omit' from source: magic vars 19285 1727203930.30818: variable 'omit' from source: magic vars 19285 1727203930.30897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203930.30968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203930.31021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203930.31066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203930.31136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203930.31243: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203930.31246: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203930.31249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203930.31567: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203930.31572: Set connection var ansible_pipelining to False 19285 1727203930.31574: Set connection var ansible_timeout to 10 19285 1727203930.31579: Set connection var ansible_shell_type to sh 19285 1727203930.31581: Set connection var ansible_shell_executable to /bin/sh 19285 1727203930.31583: Set connection var ansible_connection to ssh 19285 1727203930.31585: variable 'ansible_shell_executable' from source: unknown 19285 1727203930.31587: variable 'ansible_connection' from source: unknown 19285 1727203930.31589: variable 'ansible_module_compression' from source: unknown 19285 1727203930.31591: variable 'ansible_shell_type' from source: unknown 19285 1727203930.31593: variable 'ansible_shell_executable' from source: unknown 19285 1727203930.31727: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203930.31730: variable 'ansible_pipelining' from source: unknown 19285 1727203930.31733: variable 'ansible_timeout' from source: unknown 19285 1727203930.31735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203930.32087: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203930.32091: variable 'omit' from source: magic vars 19285 1727203930.32094: starting attempt loop 19285 1727203930.32096: running the handler 19285 1727203930.32098: variable 'ansible_facts' from source: unknown 19285 1727203930.32101: _low_level_execute_command(): starting 19285 1727203930.32103: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203930.33143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203930.33219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203930.33262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203930.33298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203930.33478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203930.35119: stdout chunk (state=3): >>>/root <<< 19285 1727203930.35235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203930.35239: stdout chunk (state=3): >>><<< 19285 1727203930.35241: stderr chunk (state=3): >>><<< 19285 1727203930.35582: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203930.35586: _low_level_execute_command(): starting 19285 1727203930.35589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000 `" && echo ansible-tmp-1727203930.3528416-21602-79281391389000="` echo /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000 `" ) && sleep 0' 19285 1727203930.36692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203930.36725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203930.36736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203930.36812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203930.37027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203930.38932: stdout chunk (state=3): >>>ansible-tmp-1727203930.3528416-21602-79281391389000=/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000 <<< 19285 1727203930.39350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203930.39373: stderr chunk (state=3): >>><<< 19285 1727203930.39448: stdout chunk (state=3): >>><<< 19285 1727203930.39453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203930.3528416-21602-79281391389000=/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203930.39685: variable 'ansible_module_compression' from source: unknown 19285 1727203930.39690: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203930.39916: variable 'ansible_facts' from source: unknown 19285 1727203930.40352: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py 19285 1727203930.40622: Sending initial data 19285 1727203930.40632: Sent initial data (153 bytes) 19285 1727203930.41784: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203930.41801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203930.41905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203930.41974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203930.42108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203930.42189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203930.43770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203930.43918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203930.44044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp4a1qddaf /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py <<< 19285 1727203930.44053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py" <<< 19285 1727203930.44180: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp4a1qddaf" to remote "/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py" <<< 19285 1727203930.47687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203930.47714: stderr chunk (state=3): >>><<< 19285 1727203930.47852: stdout chunk (state=3): >>><<< 19285 1727203930.47855: done transferring module to remote 19285 1727203930.47858: _low_level_execute_command(): starting 19285 1727203930.47862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/ /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py && sleep 0' 19285 1727203930.48528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203930.48582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203930.48598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203930.48678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203930.48698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203930.48745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203930.48839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203930.50831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203930.50887: stderr chunk (state=3): >>><<< 19285 1727203930.50945: stdout chunk (state=3): >>><<< 19285 1727203930.51090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203930.51095: _low_level_execute_command(): starting 19285 1727203930.51098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/AnsiballZ_setup.py && sleep 0' 19285 1727203930.52284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203930.52303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203930.52369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203930.52393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203930.52501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203930.52523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
19285 1727203930.52547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203930.52703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.15407: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.6455078125, "5m": 0.43017578125, "15m": 0.212890625}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_virtualization_type": 
"xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787811840, "block_size": 4096, "block_total": 65519099, "block_available": 63913040, "block_used": 1606059, "inode_total": 131070960, "inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": 
"128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "11", "epoch": "1727203931", "epoch_int": "1727203931", "date": "2024-09-24", "time": "14:52:11", "iso8601_micro": "2024-09-24T18:52:11.150354Z", "iso8601": "2024-09-24T18:52:11Z", "iso8601_basic": 
"20240924T145211150354", "iso8601_basic_short": "20240924T145211", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203931.17401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203931.17435: stderr chunk (state=3): >>><<< 19285 1727203931.17440: stdout chunk (state=3): >>><<< 19285 1727203931.17683: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.6455078125, "5m": 0.43017578125, "15m": 0.212890625}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon 
Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787811840, "block_size": 4096, "block_total": 65519099, "block_available": 63913040, "block_used": 1606059, "inode_total": 131070960, 
"inode_available": 131027263, "inode_used": 43697, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", 
"tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "11", "epoch": "1727203931", "epoch_int": "1727203931", "date": "2024-09-24", "time": "14:52:11", "iso8601_micro": "2024-09-24T18:52:11.150354Z", "iso8601": "2024-09-24T18:52:11Z", "iso8601_basic": "20240924T145211150354", "iso8601_basic_short": "20240924T145211", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 
closed. 19285 1727203931.17857: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203931.17888: _low_level_execute_command(): starting 19285 1727203931.17899: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203930.3528416-21602-79281391389000/ > /dev/null 2>&1 && sleep 0' 19285 1727203931.18539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203931.18552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203931.18582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203931.18688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.18707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.18821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.20881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203931.20884: stdout chunk (state=3): >>><<< 19285 1727203931.20887: stderr chunk (state=3): >>><<< 19285 1727203931.20890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203931.20892: handler run complete 19285 
1727203931.20894: variable 'ansible_facts' from source: unknown 19285 1727203931.21001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.21336: variable 'ansible_facts' from source: unknown 19285 1727203931.21433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.21594: attempt loop complete, returning result 19285 1727203931.21604: _execute() done 19285 1727203931.21611: dumping result to json 19285 1727203931.21645: done dumping result, returning 19285 1727203931.21659: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-0000000003a1] 19285 1727203931.21786: sending task result for task 028d2410-947f-f31b-fb3f-0000000003a1 19285 1727203931.22225: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003a1 19285 1727203931.22229: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203931.22710: no more pending results, returning what we have 19285 1727203931.22713: results queue empty 19285 1727203931.22714: checking for any_errors_fatal 19285 1727203931.22716: done checking for any_errors_fatal 19285 1727203931.22716: checking for max_fail_percentage 19285 1727203931.22718: done checking for max_fail_percentage 19285 1727203931.22719: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.22720: done checking to see if all hosts have failed 19285 1727203931.22721: getting the remaining hosts for this loop 19285 1727203931.22722: done getting the remaining hosts for this loop 19285 1727203931.22725: getting the next task for host managed-node2 19285 1727203931.22730: done getting next task for host managed-node2 19285 1727203931.22732: ^ task is: TASK: meta (flush_handlers) 19285 1727203931.22734: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203931.22737: getting variables 19285 1727203931.22739: in VariableManager get_vars() 19285 1727203931.22766: Calling all_inventory to load vars for managed-node2 19285 1727203931.22769: Calling groups_inventory to load vars for managed-node2 19285 1727203931.22771: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.22783: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.22786: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.22789: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.24164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.25873: done with get_vars() 19285 1727203931.25898: done getting variables 19285 1727203931.25979: in VariableManager get_vars() 19285 1727203931.25992: Calling all_inventory to load vars for managed-node2 19285 1727203931.25994: Calling groups_inventory to load vars for managed-node2 19285 1727203931.25996: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.26001: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.26003: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.26005: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.27361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.29168: done with get_vars() 19285 1727203931.29236: done queuing things up, now waiting for results queue to drain 19285 1727203931.29238: results queue empty 19285 1727203931.29239: checking for any_errors_fatal 19285 1727203931.29243: done checking for any_errors_fatal 19285 1727203931.29244: checking for 
max_fail_percentage 19285 1727203931.29245: done checking for max_fail_percentage 19285 1727203931.29246: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.29250: done checking to see if all hosts have failed 19285 1727203931.29251: getting the remaining hosts for this loop 19285 1727203931.29252: done getting the remaining hosts for this loop 19285 1727203931.29259: getting the next task for host managed-node2 19285 1727203931.29263: done getting next task for host managed-node2 19285 1727203931.29266: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203931.29267: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203931.29308: getting variables 19285 1727203931.29310: in VariableManager get_vars() 19285 1727203931.29363: Calling all_inventory to load vars for managed-node2 19285 1727203931.29365: Calling groups_inventory to load vars for managed-node2 19285 1727203931.29367: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.29371: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.29378: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.29381: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.30157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.31303: done with get_vars() 19285 1727203931.31329: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:52:11 -0400 
(0:00:01.029) 0:00:30.388 ***** 19285 1727203931.31435: entering _queue_task() for managed-node2/include_tasks 19285 1727203931.31880: worker is 1 (out of 1 available) 19285 1727203931.31893: exiting _queue_task() for managed-node2/include_tasks 19285 1727203931.31904: done queuing things up, now waiting for results queue to drain 19285 1727203931.31906: waiting for pending results... 19285 1727203931.32203: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19285 1727203931.32354: in run() - task 028d2410-947f-f31b-fb3f-00000000005a 19285 1727203931.32359: variable 'ansible_search_path' from source: unknown 19285 1727203931.32361: variable 'ansible_search_path' from source: unknown 19285 1727203931.32365: calling self._execute() 19285 1727203931.32473: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.32480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.32492: variable 'omit' from source: magic vars 19285 1727203931.32775: variable 'ansible_distribution_major_version' from source: facts 19285 1727203931.32786: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203931.32795: _execute() done 19285 1727203931.32798: dumping result to json 19285 1727203931.32801: done dumping result, returning 19285 1727203931.32806: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-f31b-fb3f-00000000005a] 19285 1727203931.32812: sending task result for task 028d2410-947f-f31b-fb3f-00000000005a 19285 1727203931.32898: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005a 19285 1727203931.32900: WORKER PROCESS EXITING 19285 1727203931.32973: no more pending results, returning what we have 19285 1727203931.32980: in VariableManager get_vars() 19285 1727203931.33027: Calling all_inventory to load vars for 
managed-node2 19285 1727203931.33030: Calling groups_inventory to load vars for managed-node2 19285 1727203931.33033: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.33042: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.33044: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.33046: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.33896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.35104: done with get_vars() 19285 1727203931.35127: variable 'ansible_search_path' from source: unknown 19285 1727203931.35128: variable 'ansible_search_path' from source: unknown 19285 1727203931.35156: we have included files to process 19285 1727203931.35161: generating all_blocks data 19285 1727203931.35174: done generating all_blocks data 19285 1727203931.35177: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203931.35179: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203931.35182: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19285 1727203931.35794: done processing included file 19285 1727203931.35796: iterating over new_blocks loaded from include file 19285 1727203931.35798: in VariableManager get_vars() 19285 1727203931.35820: done with get_vars() 19285 1727203931.35821: filtering new block on tags 19285 1727203931.35833: done filtering new block on tags 19285 1727203931.35835: in VariableManager get_vars() 19285 1727203931.35846: done with get_vars() 19285 1727203931.35847: filtering new block on tags 19285 1727203931.35861: done filtering new block on tags 19285 1727203931.35863: in VariableManager get_vars() 19285 1727203931.35874: done with 
get_vars() 19285 1727203931.35877: filtering new block on tags 19285 1727203931.35900: done filtering new block on tags 19285 1727203931.35902: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 19285 1727203931.35907: extending task lists for all hosts with included blocks 19285 1727203931.36329: done extending task lists 19285 1727203931.36330: done processing included files 19285 1727203931.36330: results queue empty 19285 1727203931.36330: checking for any_errors_fatal 19285 1727203931.36331: done checking for any_errors_fatal 19285 1727203931.36332: checking for max_fail_percentage 19285 1727203931.36333: done checking for max_fail_percentage 19285 1727203931.36333: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.36333: done checking to see if all hosts have failed 19285 1727203931.36334: getting the remaining hosts for this loop 19285 1727203931.36335: done getting the remaining hosts for this loop 19285 1727203931.36336: getting the next task for host managed-node2 19285 1727203931.36339: done getting next task for host managed-node2 19285 1727203931.36341: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203931.36343: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203931.36349: getting variables 19285 1727203931.36349: in VariableManager get_vars() 19285 1727203931.36369: Calling all_inventory to load vars for managed-node2 19285 1727203931.36372: Calling groups_inventory to load vars for managed-node2 19285 1727203931.36377: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.36382: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.36385: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.36391: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.37569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.39232: done with get_vars() 19285 1727203931.39257: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:52:11 -0400 (0:00:00.079) 0:00:30.467 ***** 19285 1727203931.39340: entering _queue_task() for managed-node2/setup 19285 1727203931.39728: worker is 1 (out of 1 available) 19285 1727203931.39741: exiting _queue_task() for managed-node2/setup 19285 1727203931.39751: done queuing things up, now waiting for results queue to drain 19285 1727203931.39753: waiting for pending results... 
19285 1727203931.40320: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19285 1727203931.40632: in run() - task 028d2410-947f-f31b-fb3f-0000000003e2 19285 1727203931.40636: variable 'ansible_search_path' from source: unknown 19285 1727203931.40639: variable 'ansible_search_path' from source: unknown 19285 1727203931.40685: calling self._execute() 19285 1727203931.40856: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.40863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.40880: variable 'omit' from source: magic vars 19285 1727203931.41383: variable 'ansible_distribution_major_version' from source: facts 19285 1727203931.41393: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203931.41542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203931.43582: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203931.43586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203931.43601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203931.43648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203931.43681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203931.43915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203931.43948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203931.44000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203931.44046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203931.44060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203931.44282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203931.44285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203931.44287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203931.44305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203931.44335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203931.44691: variable '__network_required_facts' from source: role 
'' defaults 19285 1727203931.44721: variable 'ansible_facts' from source: unknown 19285 1727203931.46281: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19285 1727203931.46286: when evaluation is False, skipping this task 19285 1727203931.46288: _execute() done 19285 1727203931.46290: dumping result to json 19285 1727203931.46292: done dumping result, returning 19285 1727203931.46295: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-f31b-fb3f-0000000003e2] 19285 1727203931.46297: sending task result for task 028d2410-947f-f31b-fb3f-0000000003e2 19285 1727203931.46594: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003e2 19285 1727203931.46597: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203931.46672: no more pending results, returning what we have 19285 1727203931.46682: results queue empty 19285 1727203931.46683: checking for any_errors_fatal 19285 1727203931.46686: done checking for any_errors_fatal 19285 1727203931.46686: checking for max_fail_percentage 19285 1727203931.46688: done checking for max_fail_percentage 19285 1727203931.46689: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.46690: done checking to see if all hosts have failed 19285 1727203931.46691: getting the remaining hosts for this loop 19285 1727203931.46693: done getting the remaining hosts for this loop 19285 1727203931.46697: getting the next task for host managed-node2 19285 1727203931.46708: done getting next task for host managed-node2 19285 1727203931.46713: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203931.46718: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203931.46733: getting variables 19285 1727203931.46735: in VariableManager get_vars() 19285 1727203931.46809: Calling all_inventory to load vars for managed-node2 19285 1727203931.46818: Calling groups_inventory to load vars for managed-node2 19285 1727203931.46825: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.46842: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.46845: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.46848: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.51919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.56769: done with get_vars() 19285 1727203931.56812: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:52:11 -0400 (0:00:00.176) 0:00:30.644 ***** 19285 1727203931.56994: entering _queue_task() for managed-node2/stat 19285 1727203931.57527: worker is 1 (out of 1 available) 19285 1727203931.57547: exiting _queue_task() for managed-node2/stat 19285 1727203931.57614: done queuing things up, now waiting for results queue to drain 19285 1727203931.57615: waiting for pending results... 
19285 1727203931.57892: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 19285 1727203931.57941: in run() - task 028d2410-947f-f31b-fb3f-0000000003e4 19285 1727203931.57961: variable 'ansible_search_path' from source: unknown 19285 1727203931.57969: variable 'ansible_search_path' from source: unknown 19285 1727203931.58024: calling self._execute() 19285 1727203931.58133: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.58145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.58160: variable 'omit' from source: magic vars 19285 1727203931.58546: variable 'ansible_distribution_major_version' from source: facts 19285 1727203931.58565: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203931.58717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203931.58989: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203931.59047: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203931.59093: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203931.59280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203931.59287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203931.59290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203931.59301: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203931.59338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203931.59447: variable '__network_is_ostree' from source: set_fact 19285 1727203931.59466: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203931.59484: when evaluation is False, skipping this task 19285 1727203931.59491: _execute() done 19285 1727203931.59498: dumping result to json 19285 1727203931.59506: done dumping result, returning 19285 1727203931.59517: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-f31b-fb3f-0000000003e4] 19285 1727203931.59530: sending task result for task 028d2410-947f-f31b-fb3f-0000000003e4 19285 1727203931.59635: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003e4 19285 1727203931.59642: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203931.59836: no more pending results, returning what we have 19285 1727203931.59840: results queue empty 19285 1727203931.59841: checking for any_errors_fatal 19285 1727203931.59846: done checking for any_errors_fatal 19285 1727203931.59847: checking for max_fail_percentage 19285 1727203931.59848: done checking for max_fail_percentage 19285 1727203931.59849: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.59850: done checking to see if all hosts have failed 19285 1727203931.59852: getting the remaining hosts for this loop 19285 1727203931.59853: done getting the remaining hosts for this loop 19285 
1727203931.59856: getting the next task for host managed-node2 19285 1727203931.59862: done getting next task for host managed-node2 19285 1727203931.59865: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203931.59868: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203931.59882: getting variables 19285 1727203931.59883: in VariableManager get_vars() 19285 1727203931.59918: Calling all_inventory to load vars for managed-node2 19285 1727203931.59923: Calling groups_inventory to load vars for managed-node2 19285 1727203931.59927: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.59935: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.59938: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.59940: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.63198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.65486: done with get_vars() 19285 1727203931.65529: done getting variables 19285 1727203931.65835: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:52:11 -0400 (0:00:00.088) 0:00:30.733 ***** 19285 1727203931.65884: entering _queue_task() for managed-node2/set_fact 19285 1727203931.66620: worker is 1 (out of 1 available) 19285 1727203931.66632: exiting _queue_task() for managed-node2/set_fact 19285 1727203931.66645: done queuing things up, now waiting for results queue to drain 19285 1727203931.66647: waiting for pending results... 19285 1727203931.66944: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19285 1727203931.67104: in run() - task 028d2410-947f-f31b-fb3f-0000000003e5 19285 1727203931.67107: variable 'ansible_search_path' from source: unknown 19285 1727203931.67111: variable 'ansible_search_path' from source: unknown 19285 1727203931.67129: calling self._execute() 19285 1727203931.67290: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.67376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.67428: variable 'omit' from source: magic vars 19285 1727203931.68482: variable 'ansible_distribution_major_version' from source: facts 19285 1727203931.68486: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203931.69219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203931.69932: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203931.70095: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203931.70156: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 
1727203931.70190: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203931.70420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203931.70548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203931.70586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203931.70612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203931.70708: variable '__network_is_ostree' from source: set_fact 19285 1727203931.70716: Evaluated conditional (not __network_is_ostree is defined): False 19285 1727203931.70719: when evaluation is False, skipping this task 19285 1727203931.70722: _execute() done 19285 1727203931.70725: dumping result to json 19285 1727203931.70727: done dumping result, returning 19285 1727203931.70781: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-f31b-fb3f-0000000003e5] 19285 1727203931.70789: sending task result for task 028d2410-947f-f31b-fb3f-0000000003e5 19285 1727203931.70969: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003e5 19285 1727203931.70975: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19285 1727203931.71027: no more pending results, returning what we 
have 19285 1727203931.71031: results queue empty 19285 1727203931.71032: checking for any_errors_fatal 19285 1727203931.71040: done checking for any_errors_fatal 19285 1727203931.71041: checking for max_fail_percentage 19285 1727203931.71043: done checking for max_fail_percentage 19285 1727203931.71044: checking to see if all hosts have failed and the running result is not ok 19285 1727203931.71045: done checking to see if all hosts have failed 19285 1727203931.71046: getting the remaining hosts for this loop 19285 1727203931.71047: done getting the remaining hosts for this loop 19285 1727203931.71052: getting the next task for host managed-node2 19285 1727203931.71064: done getting next task for host managed-node2 19285 1727203931.71068: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203931.71072: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203931.71087: getting variables 19285 1727203931.71089: in VariableManager get_vars() 19285 1727203931.71131: Calling all_inventory to load vars for managed-node2 19285 1727203931.71135: Calling groups_inventory to load vars for managed-node2 19285 1727203931.71138: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203931.71149: Calling all_plugins_play to load vars for managed-node2 19285 1727203931.71152: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203931.71155: Calling groups_plugins_play to load vars for managed-node2 19285 1727203931.72354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203931.73601: done with get_vars() 19285 1727203931.73629: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:52:11 -0400 (0:00:00.078) 0:00:30.811 ***** 19285 1727203931.73728: entering _queue_task() for managed-node2/service_facts 19285 1727203931.74113: worker is 1 (out of 1 available) 19285 1727203931.74132: exiting _queue_task() for managed-node2/service_facts 19285 1727203931.74144: done queuing things up, now waiting for results queue to drain 19285 1727203931.74145: waiting for pending results... 
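The two tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") were both skipped because their `when` condition, `not __network_is_ostree is defined`, evaluated to False once an earlier `set_fact` had defined the variable. The following is a hypothetical Python sketch of that skip decision, not Ansible's actual `TaskExecutor` code; the helper names and the simplified condition parsing are illustrative assumptions, but the returned dict mirrors the "skipping" result printed in the log.

```python
# Hypothetical sketch (NOT Ansible's real TaskExecutor) of the skip
# decision visible in the log: a task whose `when` clause evaluates
# False is not executed, and a "skipping" result is emitted instead.

def evaluate_when(condition: str, hostvars: dict) -> bool:
    # Illustrative assumption: model the logged condition
    # `not __network_is_ostree is defined` as a plain presence check.
    var = condition.removeprefix("not ").split(" is defined")[0]
    return var not in hostvars

def run_task(condition: str, hostvars: dict) -> dict:
    if not evaluate_when(condition, hostvars):
        # Shape copied from the skipped-task result shown in the log.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}  # would actually execute the module

# __network_is_ostree was set by an earlier set_fact, so the task skips.
hostvars = {"__network_is_ostree": False}
result = run_task("not __network_is_ostree is defined", hostvars)
print(result["skip_reason"])  # Conditional result was False
```

This matches the log's `skipping: [managed-node2]` output, where the unexecuted task still reports `changed: false` along with the condition that failed.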
19285 1727203931.74693: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 19285 1727203931.74698: in run() - task 028d2410-947f-f31b-fb3f-0000000003e7 19285 1727203931.74701: variable 'ansible_search_path' from source: unknown 19285 1727203931.74704: variable 'ansible_search_path' from source: unknown 19285 1727203931.74707: calling self._execute() 19285 1727203931.74987: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.74990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.74994: variable 'omit' from source: magic vars 19285 1727203931.75286: variable 'ansible_distribution_major_version' from source: facts 19285 1727203931.75301: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203931.75313: variable 'omit' from source: magic vars 19285 1727203931.75489: variable 'omit' from source: magic vars 19285 1727203931.75493: variable 'omit' from source: magic vars 19285 1727203931.75641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203931.75645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203931.75647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203931.75665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203931.75682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203931.75747: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203931.75750: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.75753: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 19285 1727203931.75825: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203931.75835: Set connection var ansible_pipelining to False 19285 1727203931.75847: Set connection var ansible_timeout to 10 19285 1727203931.75855: Set connection var ansible_shell_type to sh 19285 1727203931.75857: Set connection var ansible_shell_executable to /bin/sh 19285 1727203931.75860: Set connection var ansible_connection to ssh 19285 1727203931.75961: variable 'ansible_shell_executable' from source: unknown 19285 1727203931.75965: variable 'ansible_connection' from source: unknown 19285 1727203931.76008: variable 'ansible_module_compression' from source: unknown 19285 1727203931.76116: variable 'ansible_shell_type' from source: unknown 19285 1727203931.76119: variable 'ansible_shell_executable' from source: unknown 19285 1727203931.76121: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203931.76124: variable 'ansible_pipelining' from source: unknown 19285 1727203931.76126: variable 'ansible_timeout' from source: unknown 19285 1727203931.76127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203931.76479: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203931.76489: variable 'omit' from source: magic vars 19285 1727203931.76492: starting attempt loop 19285 1727203931.76739: running the handler 19285 1727203931.76742: _low_level_execute_command(): starting 19285 1727203931.76744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203931.78192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203931.78196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 19285 1727203931.78199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203931.78202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203931.78204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203931.78207: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203931.78209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203931.78212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203931.78214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203931.78216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203931.78218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203931.78220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203931.78222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203931.78224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203931.78331: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203931.78334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.78336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.78434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.80139: stdout chunk (state=3): >>>/root <<< 19285 
1727203931.80519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203931.80522: stdout chunk (state=3): >>><<< 19285 1727203931.80525: stderr chunk (state=3): >>><<< 19285 1727203931.80529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203931.80532: _low_level_execute_command(): starting 19285 1727203931.80534: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445 `" && echo ansible-tmp-1727203931.8030646-21669-9754083269445="` echo /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445 `" ) && sleep 0' 19285 1727203931.81441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203931.81639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203931.81642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.81752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.81892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.83815: stdout chunk (state=3): >>>ansible-tmp-1727203931.8030646-21669-9754083269445=/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445 <<< 19285 1727203931.83959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203931.83966: stdout chunk (state=3): >>><<< 19285 1727203931.83973: stderr chunk (state=3): >>><<< 19285 1727203931.84024: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203931.8030646-21669-9754083269445=/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203931.84083: variable 'ansible_module_compression' from source: unknown 19285 1727203931.84493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19285 1727203931.84496: variable 'ansible_facts' from source: unknown 19285 1727203931.84498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py 19285 1727203931.84635: Sending initial data 19285 1727203931.84638: Sent initial data (160 bytes) 19285 1727203931.85512: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203931.85531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203931.85547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.85574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.85671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.87281: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203931.87350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203931.87434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpx9gn_2ad /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py <<< 19285 1727203931.87438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py" <<< 19285 1727203931.87515: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpx9gn_2ad" to remote "/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py" <<< 19285 1727203931.88653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203931.88990: stderr chunk (state=3): >>><<< 19285 1727203931.88994: stdout chunk (state=3): >>><<< 19285 1727203931.88996: done transferring module to remote 19285 1727203931.88998: _low_level_execute_command(): starting 19285 1727203931.89001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/ /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py && sleep 0' 19285 1727203931.90013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203931.90028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203931.90044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203931.90208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203931.90229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.90278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.90429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203931.92397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203931.92427: stderr chunk (state=3): >>><<< 19285 1727203931.92436: stdout chunk (state=3): >>><<< 19285 1727203931.92457: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203931.92662: _low_level_execute_command(): starting 19285 1727203931.92666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/AnsiballZ_service_facts.py && sleep 0' 19285 1727203931.93788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203931.93993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203931.94115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 
1727203933.46628: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": 
{"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": 
{"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": 
{"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 19285 1727203933.46669: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "<<< 19285 1727203933.46705: stdout chunk (state=3): >>>systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19285 1727203933.48165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203933.48172: stderr chunk (state=3): >>><<< 19285 1727203933.48176: stdout chunk (state=3): >>><<< 19285 1727203933.48206: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203933.54522: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203933.54533: _low_level_execute_command(): starting 19285 1727203933.54538: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203931.8030646-21669-9754083269445/ > /dev/null 2>&1 && sleep 0' 19285 1727203933.55286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203933.55290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.55392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203933.57582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203933.57586: stderr chunk (state=3): >>><<< 19285 1727203933.57588: stdout chunk (state=3): >>><<< 19285 1727203933.57591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203933.57593: handler run complete 19285 1727203933.57690: variable 'ansible_facts' from source: unknown 19285 1727203933.57844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 
1727203933.58408: variable 'ansible_facts' from source: unknown 19285 1727203933.58547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203933.58764: attempt loop complete, returning result 19285 1727203933.58779: _execute() done 19285 1727203933.58787: dumping result to json 19285 1727203933.58854: done dumping result, returning 19285 1727203933.58871: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-f31b-fb3f-0000000003e7] 19285 1727203933.58882: sending task result for task 028d2410-947f-f31b-fb3f-0000000003e7 19285 1727203933.65231: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003e7 19285 1727203933.65235: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203933.65327: no more pending results, returning what we have 19285 1727203933.65329: results queue empty 19285 1727203933.65330: checking for any_errors_fatal 19285 1727203933.65333: done checking for any_errors_fatal 19285 1727203933.65334: checking for max_fail_percentage 19285 1727203933.65335: done checking for max_fail_percentage 19285 1727203933.65336: checking to see if all hosts have failed and the running result is not ok 19285 1727203933.65337: done checking to see if all hosts have failed 19285 1727203933.65338: getting the remaining hosts for this loop 19285 1727203933.65339: done getting the remaining hosts for this loop 19285 1727203933.65342: getting the next task for host managed-node2 19285 1727203933.65348: done getting next task for host managed-node2 19285 1727203933.65350: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19285 1727203933.65352: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203933.65360: getting variables 19285 1727203933.65361: in VariableManager get_vars() 19285 1727203933.65387: Calling all_inventory to load vars for managed-node2 19285 1727203933.65390: Calling groups_inventory to load vars for managed-node2 19285 1727203933.65392: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203933.65398: Calling all_plugins_play to load vars for managed-node2 19285 1727203933.65401: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203933.65403: Calling groups_plugins_play to load vars for managed-node2 19285 1727203933.67057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203933.69184: done with get_vars() 19285 1727203933.69213: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:52:13 -0400 (0:00:01.955) 0:00:32.767 ***** 19285 1727203933.69306: entering _queue_task() for managed-node2/package_facts 19285 1727203933.69683: worker is 1 (out of 1 available) 19285 1727203933.69700: exiting _queue_task() for managed-node2/package_facts 19285 1727203933.69712: done queuing things up, now waiting for results queue to drain 19285 1727203933.69713: waiting for pending results... 
19285 1727203933.69978: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 19285 1727203933.70123: in run() - task 028d2410-947f-f31b-fb3f-0000000003e8 19285 1727203933.70145: variable 'ansible_search_path' from source: unknown 19285 1727203933.70151: variable 'ansible_search_path' from source: unknown 19285 1727203933.70192: calling self._execute() 19285 1727203933.70295: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203933.70306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203933.70323: variable 'omit' from source: magic vars 19285 1727203933.70730: variable 'ansible_distribution_major_version' from source: facts 19285 1727203933.70752: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203933.70768: variable 'omit' from source: magic vars 19285 1727203933.70856: variable 'omit' from source: magic vars 19285 1727203933.70867: variable 'omit' from source: magic vars 19285 1727203933.70919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203933.70968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203933.71002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203933.71076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203933.71079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203933.71082: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203933.71090: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203933.71098: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 19285 1727203933.71224: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203933.71238: Set connection var ansible_pipelining to False 19285 1727203933.71249: Set connection var ansible_timeout to 10 19285 1727203933.71257: Set connection var ansible_shell_type to sh 19285 1727203933.71274: Set connection var ansible_shell_executable to /bin/sh 19285 1727203933.71480: Set connection var ansible_connection to ssh 19285 1727203933.71483: variable 'ansible_shell_executable' from source: unknown 19285 1727203933.71486: variable 'ansible_connection' from source: unknown 19285 1727203933.71490: variable 'ansible_module_compression' from source: unknown 19285 1727203933.71492: variable 'ansible_shell_type' from source: unknown 19285 1727203933.71494: variable 'ansible_shell_executable' from source: unknown 19285 1727203933.71496: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203933.71499: variable 'ansible_pipelining' from source: unknown 19285 1727203933.71501: variable 'ansible_timeout' from source: unknown 19285 1727203933.71503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203933.71579: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203933.71793: variable 'omit' from source: magic vars 19285 1727203933.71802: starting attempt loop 19285 1727203933.71808: running the handler 19285 1727203933.71907: _low_level_execute_command(): starting 19285 1727203933.71920: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203933.72672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203933.72784: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203933.72815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.72931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203933.74641: stdout chunk (state=3): >>>/root <<< 19285 1727203933.74781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203933.74803: stdout chunk (state=3): >>><<< 19285 1727203933.74823: stderr chunk (state=3): >>><<< 19285 1727203933.74944: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203933.74948: _low_level_execute_command(): starting 19285 1727203933.74952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387 `" && echo ansible-tmp-1727203933.7484465-21765-156978957058387="` echo /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387 `" ) && sleep 0' 19285 1727203933.75539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203933.75551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.75667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203933.77608: stdout chunk (state=3): >>>ansible-tmp-1727203933.7484465-21765-156978957058387=/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387 <<< 19285 1727203933.77711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203933.77798: stderr chunk (state=3): >>><<< 19285 1727203933.77802: stdout chunk (state=3): >>><<< 19285 1727203933.77819: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203933.7484465-21765-156978957058387=/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203933.77981: variable 'ansible_module_compression' from source: unknown 19285 1727203933.77986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19285 1727203933.78026: variable 'ansible_facts' from source: unknown 19285 1727203933.78244: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py 19285 1727203933.78396: Sending initial data 19285 1727203933.78439: Sent initial data (162 bytes) 19285 1727203933.79101: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203933.79198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203933.79226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203933.79244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 
1727203933.79274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.79378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203933.81049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203933.81128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203933.81214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpsvdts2yj /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py <<< 19285 1727203933.81217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py" <<< 19285 1727203933.81299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpsvdts2yj" to remote "/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py" <<< 19285 1727203933.83084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203933.83235: stderr chunk (state=3): >>><<< 19285 1727203933.83239: stdout chunk (state=3): >>><<< 19285 1727203933.83242: done transferring module to remote 19285 1727203933.83244: _low_level_execute_command(): starting 19285 1727203933.83246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/ /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py && sleep 0' 19285 1727203933.83852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203933.83869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203933.83936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203933.84006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203933.84037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203933.84062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.84178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203933.86065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203933.86095: stderr chunk (state=3): >>><<< 19285 1727203933.86099: stdout chunk (state=3): >>><<< 19285 1727203933.86201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203933.86205: _low_level_execute_command(): starting 19285 1727203933.86208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/AnsiballZ_package_facts.py && sleep 0' 19285 1727203933.86832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203933.86890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203933.86960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203933.86978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 
1727203933.86998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203933.87108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203934.31853: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 19285 1727203934.31874: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 19285 1727203934.31929: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el1<<< 19285 1727203934.31945: stdout chunk (state=3): >>>0", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": 
"mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 19285 1727203934.31953: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 19285 1727203934.32005: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 19285 
1727203934.32016: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", 
"version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19285 1727203934.33780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203934.33812: stderr chunk (state=3): >>><<< 19285 1727203934.33815: stdout chunk (state=3): >>><<< 19285 1727203934.33864: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.13.254 closed. 19285 1727203934.35156: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203934.35174: _low_level_execute_command(): starting 19285 1727203934.35180: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203933.7484465-21765-156978957058387/ > /dev/null 2>&1 && sleep 0' 19285 1727203934.35637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203934.35641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203934.35643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203934.35646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203934.35648: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203934.35705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203934.35708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203934.35711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203934.35791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203934.37695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203934.37723: stderr chunk (state=3): >>><<< 19285 1727203934.37726: stdout chunk (state=3): >>><<< 19285 1727203934.37738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203934.37744: handler run complete 19285 1727203934.38205: variable 'ansible_facts' from source: unknown 19285 1727203934.38519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.39566: variable 'ansible_facts' from source: unknown 19285 1727203934.39805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.40185: attempt loop complete, returning result 19285 1727203934.40195: _execute() done 19285 1727203934.40198: dumping result to json 19285 1727203934.40317: done dumping result, returning 19285 1727203934.40321: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-f31b-fb3f-0000000003e8] 19285 1727203934.40326: sending task result for task 028d2410-947f-f31b-fb3f-0000000003e8 19285 1727203934.41593: done sending task result for task 028d2410-947f-f31b-fb3f-0000000003e8 19285 1727203934.41596: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203934.41673: no more pending results, returning what we have 19285 1727203934.41677: results queue empty 19285 1727203934.41678: checking for any_errors_fatal 19285 1727203934.41682: done checking for any_errors_fatal 19285 1727203934.41682: checking for max_fail_percentage 19285 1727203934.41683: done checking for max_fail_percentage 19285 1727203934.41684: checking to see if all hosts have failed and the running result is not ok 19285 1727203934.41684: done checking to see if all hosts have failed 19285 1727203934.41685: getting the remaining hosts for this loop 19285 1727203934.41686: done getting the remaining 
hosts for this loop 19285 1727203934.41688: getting the next task for host managed-node2 19285 1727203934.41692: done getting next task for host managed-node2 19285 1727203934.41695: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203934.41696: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203934.41702: getting variables 19285 1727203934.41702: in VariableManager get_vars() 19285 1727203934.41727: Calling all_inventory to load vars for managed-node2 19285 1727203934.41729: Calling groups_inventory to load vars for managed-node2 19285 1727203934.41730: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203934.41736: Calling all_plugins_play to load vars for managed-node2 19285 1727203934.41738: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203934.41740: Calling groups_plugins_play to load vars for managed-node2 19285 1727203934.42422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.43307: done with get_vars() 19285 1727203934.43323: done getting variables 19285 1727203934.43372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:52:14 -0400 (0:00:00.740) 0:00:33.508 ***** 19285 
1727203934.43396: entering _queue_task() for managed-node2/debug 19285 1727203934.43645: worker is 1 (out of 1 available) 19285 1727203934.43661: exiting _queue_task() for managed-node2/debug 19285 1727203934.43672: done queuing things up, now waiting for results queue to drain 19285 1727203934.43674: waiting for pending results... 19285 1727203934.43854: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 19285 1727203934.43931: in run() - task 028d2410-947f-f31b-fb3f-00000000005b 19285 1727203934.43943: variable 'ansible_search_path' from source: unknown 19285 1727203934.43947: variable 'ansible_search_path' from source: unknown 19285 1727203934.43978: calling self._execute() 19285 1727203934.44061: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.44066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.44072: variable 'omit' from source: magic vars 19285 1727203934.44346: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.44355: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203934.44364: variable 'omit' from source: magic vars 19285 1727203934.44390: variable 'omit' from source: magic vars 19285 1727203934.44461: variable 'network_provider' from source: set_fact 19285 1727203934.44474: variable 'omit' from source: magic vars 19285 1727203934.44508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203934.44535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203934.44553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203934.44582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203934.44586: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203934.44601: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203934.44603: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.44608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.44681: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203934.44688: Set connection var ansible_pipelining to False 19285 1727203934.44693: Set connection var ansible_timeout to 10 19285 1727203934.44695: Set connection var ansible_shell_type to sh 19285 1727203934.44701: Set connection var ansible_shell_executable to /bin/sh 19285 1727203934.44705: Set connection var ansible_connection to ssh 19285 1727203934.44721: variable 'ansible_shell_executable' from source: unknown 19285 1727203934.44724: variable 'ansible_connection' from source: unknown 19285 1727203934.44726: variable 'ansible_module_compression' from source: unknown 19285 1727203934.44729: variable 'ansible_shell_type' from source: unknown 19285 1727203934.44731: variable 'ansible_shell_executable' from source: unknown 19285 1727203934.44733: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.44737: variable 'ansible_pipelining' from source: unknown 19285 1727203934.44740: variable 'ansible_timeout' from source: unknown 19285 1727203934.44744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.44844: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203934.44852: variable 'omit' from source: magic vars 19285 
1727203934.44855: starting attempt loop 19285 1727203934.44858: running the handler 19285 1727203934.44898: handler run complete 19285 1727203934.44908: attempt loop complete, returning result 19285 1727203934.44911: _execute() done 19285 1727203934.44914: dumping result to json 19285 1727203934.44916: done dumping result, returning 19285 1727203934.44923: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-f31b-fb3f-00000000005b] 19285 1727203934.44928: sending task result for task 028d2410-947f-f31b-fb3f-00000000005b 19285 1727203934.45011: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005b 19285 1727203934.45014: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 19285 1727203934.45068: no more pending results, returning what we have 19285 1727203934.45071: results queue empty 19285 1727203934.45072: checking for any_errors_fatal 19285 1727203934.45084: done checking for any_errors_fatal 19285 1727203934.45085: checking for max_fail_percentage 19285 1727203934.45087: done checking for max_fail_percentage 19285 1727203934.45088: checking to see if all hosts have failed and the running result is not ok 19285 1727203934.45089: done checking to see if all hosts have failed 19285 1727203934.45089: getting the remaining hosts for this loop 19285 1727203934.45091: done getting the remaining hosts for this loop 19285 1727203934.45095: getting the next task for host managed-node2 19285 1727203934.45101: done getting next task for host managed-node2 19285 1727203934.45104: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203934.45106: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203934.45115: getting variables 19285 1727203934.45117: in VariableManager get_vars() 19285 1727203934.45149: Calling all_inventory to load vars for managed-node2 19285 1727203934.45152: Calling groups_inventory to load vars for managed-node2 19285 1727203934.45154: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203934.45164: Calling all_plugins_play to load vars for managed-node2 19285 1727203934.45167: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203934.45169: Calling groups_plugins_play to load vars for managed-node2 19285 1727203934.46017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.46904: done with get_vars() 19285 1727203934.46922: done getting variables 19285 1727203934.46964: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:52:14 -0400 (0:00:00.035) 0:00:33.544 ***** 19285 1727203934.46986: entering _queue_task() for managed-node2/fail 19285 1727203934.47213: worker is 1 (out of 1 available) 19285 1727203934.47227: exiting _queue_task() for managed-node2/fail 19285 1727203934.47239: done queuing things up, now waiting for results queue to drain 19285 1727203934.47241: waiting for pending results... 
19285 1727203934.47413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19285 1727203934.47489: in run() - task 028d2410-947f-f31b-fb3f-00000000005c 19285 1727203934.47500: variable 'ansible_search_path' from source: unknown 19285 1727203934.47503: variable 'ansible_search_path' from source: unknown 19285 1727203934.47532: calling self._execute() 19285 1727203934.47609: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.47613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.47621: variable 'omit' from source: magic vars 19285 1727203934.47880: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.47889: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203934.47972: variable 'network_state' from source: role '' defaults 19285 1727203934.47983: Evaluated conditional (network_state != {}): False 19285 1727203934.47986: when evaluation is False, skipping this task 19285 1727203934.47989: _execute() done 19285 1727203934.47992: dumping result to json 19285 1727203934.47995: done dumping result, returning 19285 1727203934.48002: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-f31b-fb3f-00000000005c] 19285 1727203934.48005: sending task result for task 028d2410-947f-f31b-fb3f-00000000005c 19285 1727203934.48094: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005c 19285 1727203934.48097: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203934.48157: no more pending results, 
returning what we have 19285 1727203934.48163: results queue empty 19285 1727203934.48165: checking for any_errors_fatal 19285 1727203934.48170: done checking for any_errors_fatal 19285 1727203934.48171: checking for max_fail_percentage 19285 1727203934.48173: done checking for max_fail_percentage 19285 1727203934.48173: checking to see if all hosts have failed and the running result is not ok 19285 1727203934.48174: done checking to see if all hosts have failed 19285 1727203934.48177: getting the remaining hosts for this loop 19285 1727203934.48178: done getting the remaining hosts for this loop 19285 1727203934.48181: getting the next task for host managed-node2 19285 1727203934.48186: done getting next task for host managed-node2 19285 1727203934.48189: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203934.48191: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203934.48203: getting variables 19285 1727203934.48204: in VariableManager get_vars() 19285 1727203934.48233: Calling all_inventory to load vars for managed-node2 19285 1727203934.48235: Calling groups_inventory to load vars for managed-node2 19285 1727203934.48238: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203934.48245: Calling all_plugins_play to load vars for managed-node2 19285 1727203934.48247: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203934.48250: Calling groups_plugins_play to load vars for managed-node2 19285 1727203934.49319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.50377: done with get_vars() 19285 1727203934.50399: done getting variables 19285 1727203934.50441: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:52:14 -0400 (0:00:00.034) 0:00:33.579 ***** 19285 1727203934.50466: entering _queue_task() for managed-node2/fail 19285 1727203934.50721: worker is 1 (out of 1 available) 19285 1727203934.50737: exiting _queue_task() for managed-node2/fail 19285 1727203934.50749: done queuing things up, now waiting for results queue to drain 19285 1727203934.50750: waiting for pending results... 
19285 1727203934.50930: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19285 1727203934.51006: in run() - task 028d2410-947f-f31b-fb3f-00000000005d 19285 1727203934.51018: variable 'ansible_search_path' from source: unknown 19285 1727203934.51022: variable 'ansible_search_path' from source: unknown 19285 1727203934.51050: calling self._execute() 19285 1727203934.51132: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.51137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.51145: variable 'omit' from source: magic vars 19285 1727203934.51417: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.51424: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203934.51543: variable 'network_state' from source: role '' defaults 19285 1727203934.51552: Evaluated conditional (network_state != {}): False 19285 1727203934.51555: when evaluation is False, skipping this task 19285 1727203934.51558: _execute() done 19285 1727203934.51560: dumping result to json 19285 1727203934.51566: done dumping result, returning 19285 1727203934.51579: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-f31b-fb3f-00000000005d] 19285 1727203934.51591: sending task result for task 028d2410-947f-f31b-fb3f-00000000005d 19285 1727203934.51679: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005d 19285 1727203934.51682: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203934.51724: no more pending results, returning what we have 19285 
1727203934.51728: results queue empty 19285 1727203934.51729: checking for any_errors_fatal 19285 1727203934.51737: done checking for any_errors_fatal 19285 1727203934.51738: checking for max_fail_percentage 19285 1727203934.51740: done checking for max_fail_percentage 19285 1727203934.51741: checking to see if all hosts have failed and the running result is not ok 19285 1727203934.51742: done checking to see if all hosts have failed 19285 1727203934.51742: getting the remaining hosts for this loop 19285 1727203934.51744: done getting the remaining hosts for this loop 19285 1727203934.51747: getting the next task for host managed-node2 19285 1727203934.51753: done getting next task for host managed-node2 19285 1727203934.51757: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19285 1727203934.51759: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203934.51773: getting variables 19285 1727203934.51775: in VariableManager get_vars() 19285 1727203934.51812: Calling all_inventory to load vars for managed-node2 19285 1727203934.51815: Calling groups_inventory to load vars for managed-node2 19285 1727203934.51817: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203934.51825: Calling all_plugins_play to load vars for managed-node2 19285 1727203934.51828: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203934.51830: Calling groups_plugins_play to load vars for managed-node2 19285 1727203934.53068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.54083: done with get_vars() 19285 1727203934.54101: done getting variables 19285 1727203934.54143: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:52:14 -0400 (0:00:00.036) 0:00:33.616 ***** 19285 1727203934.54167: entering _queue_task() for managed-node2/fail 19285 1727203934.54405: worker is 1 (out of 1 available) 19285 1727203934.54419: exiting _queue_task() for managed-node2/fail 19285 1727203934.54431: done queuing things up, now waiting for results queue to drain 19285 1727203934.54432: waiting for pending results... 
19285 1727203934.54607: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19285 1727203934.54683: in run() - task 028d2410-947f-f31b-fb3f-00000000005e 19285 1727203934.54695: variable 'ansible_search_path' from source: unknown 19285 1727203934.54699: variable 'ansible_search_path' from source: unknown 19285 1727203934.54727: calling self._execute() 19285 1727203934.54810: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.54815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.54824: variable 'omit' from source: magic vars 19285 1727203934.55101: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.55105: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203934.55225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203934.57545: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203934.57601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203934.57641: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203934.57764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203934.57768: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203934.57805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.57849: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.57890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.57935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.57954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.58057: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.58089: Evaluated conditional (ansible_distribution_major_version | int > 9): True 19285 1727203934.58207: variable 'ansible_distribution' from source: facts 19285 1727203934.58216: variable '__network_rh_distros' from source: role '' defaults 19285 1727203934.58229: Evaluated conditional (ansible_distribution in __network_rh_distros): True 19285 1727203934.58502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.58541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.58629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 
1727203934.58632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.58647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.58702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.58734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.58769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.58813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.58830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.58885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.58956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 19285 1727203934.58962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.58988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.59005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.59330: variable 'network_connections' from source: play vars 19285 1727203934.59346: variable 'profile' from source: play vars 19285 1727203934.59424: variable 'profile' from source: play vars 19285 1727203934.59434: variable 'interface' from source: set_fact 19285 1727203934.59580: variable 'interface' from source: set_fact 19285 1727203934.59583: variable 'network_state' from source: role '' defaults 19285 1727203934.59591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203934.59773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203934.59817: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203934.59862: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203934.59900: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203934.59955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203934.59997: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203934.60030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.60071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203934.60156: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 19285 1727203934.60162: when evaluation is False, skipping this task 19285 1727203934.60164: _execute() done 19285 1727203934.60167: dumping result to json 19285 1727203934.60169: done dumping result, returning 19285 1727203934.60171: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-f31b-fb3f-00000000005e] 19285 1727203934.60173: sending task result for task 028d2410-947f-f31b-fb3f-00000000005e skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 19285 1727203934.60309: no more pending results, returning what we have 19285 1727203934.60312: results queue empty 19285 1727203934.60313: checking for 
any_errors_fatal 19285 1727203934.60321: done checking for any_errors_fatal 19285 1727203934.60322: checking for max_fail_percentage 19285 1727203934.60323: done checking for max_fail_percentage 19285 1727203934.60324: checking to see if all hosts have failed and the running result is not ok 19285 1727203934.60325: done checking to see if all hosts have failed 19285 1727203934.60326: getting the remaining hosts for this loop 19285 1727203934.60328: done getting the remaining hosts for this loop 19285 1727203934.60332: getting the next task for host managed-node2 19285 1727203934.60339: done getting next task for host managed-node2 19285 1727203934.60343: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19285 1727203934.60345: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203934.60358: getting variables 19285 1727203934.60362: in VariableManager get_vars() 19285 1727203934.60405: Calling all_inventory to load vars for managed-node2 19285 1727203934.60409: Calling groups_inventory to load vars for managed-node2 19285 1727203934.60412: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203934.60423: Calling all_plugins_play to load vars for managed-node2 19285 1727203934.60426: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203934.60429: Calling groups_plugins_play to load vars for managed-node2 19285 1727203934.61195: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005e 19285 1727203934.61199: WORKER PROCESS EXITING 19285 1727203934.62188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203934.63896: done with get_vars() 19285 1727203934.63921: done getting variables 19285 1727203934.63994: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:52:14 -0400 (0:00:00.098) 0:00:33.714 ***** 19285 1727203934.64027: entering _queue_task() for managed-node2/dnf 19285 1727203934.64498: worker is 1 (out of 1 available) 19285 1727203934.64510: exiting _queue_task() for managed-node2/dnf 19285 1727203934.64519: done queuing things up, now waiting for results queue to drain 19285 1727203934.64520: waiting for pending results... 
19285 1727203934.64727: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19285 1727203934.64852: in run() - task 028d2410-947f-f31b-fb3f-00000000005f 19285 1727203934.64881: variable 'ansible_search_path' from source: unknown 19285 1727203934.64890: variable 'ansible_search_path' from source: unknown 19285 1727203934.64938: calling self._execute() 19285 1727203934.65061: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203934.65081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203934.65098: variable 'omit' from source: magic vars 19285 1727203934.65518: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.65536: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203934.65755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203934.68248: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203934.68310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203934.68346: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203934.68380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203934.68407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203934.68483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.68523: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.68548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.68729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.68733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.68824: variable 'ansible_distribution' from source: facts 19285 1727203934.68828: variable 'ansible_distribution_major_version' from source: facts 19285 1727203934.68842: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19285 1727203934.69045: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203934.69127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.69151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.69178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.69217: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.69232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.69305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.69309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203934.69319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203934.69580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203934.69584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203934.69587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203934.69589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 
1727203934.69592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203934.69594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203934.69596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203934.69665: variable 'network_connections' from source: play vars
19285 1727203934.69674: variable 'profile' from source: play vars
19285 1727203934.69736: variable 'profile' from source: play vars
19285 1727203934.69739: variable 'interface' from source: set_fact
19285 1727203934.69962: variable 'interface' from source: set_fact
19285 1727203934.70043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203934.70448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203934.70451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203934.70454: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203934.70456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203934.70612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203934.70615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203934.70625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203934.70635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203934.70887: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203934.71364: variable 'network_connections' from source: play vars
19285 1727203934.71368: variable 'profile' from source: play vars
19285 1727203934.71478: variable 'profile' from source: play vars
19285 1727203934.71595: variable 'interface' from source: set_fact
19285 1727203934.71643: variable 'interface' from source: set_fact
19285 1727203934.71671: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19285 1727203934.71697: when evaluation is False, skipping this task
19285 1727203934.71705: _execute() done
19285 1727203934.71708: dumping result to json
19285 1727203934.71710: done dumping result, returning
19285 1727203934.71766: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-00000000005f]
19285 1727203934.71769: sending task result for task 028d2410-947f-f31b-fb3f-00000000005f
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
19285 1727203934.71972: no more pending results, returning what we have
19285 1727203934.71978: results queue empty
19285 1727203934.71979: checking for any_errors_fatal
19285 1727203934.71986: done checking for any_errors_fatal
19285 1727203934.71987: checking for max_fail_percentage
19285 1727203934.71988: done checking for max_fail_percentage
19285 1727203934.71989: checking to see if all hosts have failed and the running result is not ok
19285 1727203934.71990: done checking to see if all hosts have failed
19285 1727203934.71990: getting the remaining hosts for this loop
19285 1727203934.71996: done getting the remaining hosts for this loop
19285 1727203934.72000: getting the next task for host managed-node2
19285 1727203934.72024: done getting next task for host managed-node2
19285 1727203934.72029: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
19285 1727203934.72031: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203934.72045: getting variables
19285 1727203934.72047: in VariableManager get_vars()
19285 1727203934.72092: Calling all_inventory to load vars for managed-node2
19285 1727203934.72096: Calling groups_inventory to load vars for managed-node2
19285 1727203934.72099: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203934.72283: done sending task result for task 028d2410-947f-f31b-fb3f-00000000005f
19285 1727203934.72286: WORKER PROCESS EXITING
19285 1727203934.72295: Calling all_plugins_play to load vars for managed-node2
19285 1727203934.72298: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203934.72302: Calling groups_plugins_play to load vars for managed-node2
19285 1727203934.74981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203934.78736: done with get_vars()
19285 1727203934.78771: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19285 1727203934.78970: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024  14:52:14 -0400 (0:00:00.149)       0:00:33.864 *****
19285 1727203934.79005: entering _queue_task() for managed-node2/yum
19285 1727203934.79594: worker is 1 (out of 1 available)
19285 1727203934.79610: exiting _queue_task() for managed-node2/yum
19285 1727203934.79621: done queuing things up, now waiting for results queue to drain
19285 1727203934.79623: waiting for pending results...
19285 1727203934.79846: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
19285 1727203934.79983: in run() - task 028d2410-947f-f31b-fb3f-000000000060
19285 1727203934.80048: variable 'ansible_search_path' from source: unknown
19285 1727203934.80052: variable 'ansible_search_path' from source: unknown
19285 1727203934.80057: calling self._execute()
19285 1727203934.80229: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203934.80243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203934.80279: variable 'omit' from source: magic vars
19285 1727203934.80794: variable 'ansible_distribution_major_version' from source: facts
19285 1727203934.80853: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203934.81014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203934.84785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203934.84789: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203934.84930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203934.85036: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203934.85088: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203934.85433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203934.85438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203934.85441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203934.85667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203934.85670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203934.85789: variable 'ansible_distribution_major_version' from source: facts
19285 1727203934.85810: Evaluated conditional (ansible_distribution_major_version | int < 8): False
19285 1727203934.85818: when evaluation is False, skipping this task
19285 1727203934.85978: _execute() done
19285 1727203934.85982: dumping result to json
19285 1727203934.85985: done dumping result, returning
19285 1727203934.85987: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000060]
19285 1727203934.85990: sending task result for task 028d2410-947f-f31b-fb3f-000000000060
19285 1727203934.86053: done sending task result for task 028d2410-947f-f31b-fb3f-000000000060
19285 1727203934.86056: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
19285 1727203934.86111: no more pending results, returning what we have
19285 1727203934.86116: results queue empty
19285 1727203934.86117: checking for any_errors_fatal
19285 1727203934.86125: done checking for any_errors_fatal
19285 1727203934.86126: checking for max_fail_percentage
19285 1727203934.86128: done checking for max_fail_percentage
19285 1727203934.86129: checking to see if all hosts have failed and the running result is not ok
19285 1727203934.86130: done checking to see if all hosts have failed
19285 1727203934.86131: getting the remaining hosts for this loop
19285 1727203934.86133: done getting the remaining hosts for this loop
19285 1727203934.86136: getting the next task for host managed-node2
19285 1727203934.86144: done getting next task for host managed-node2
19285 1727203934.86149: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19285 1727203934.86151: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203934.86169: getting variables
19285 1727203934.86171: in VariableManager get_vars()
19285 1727203934.86216: Calling all_inventory to load vars for managed-node2
19285 1727203934.86219: Calling groups_inventory to load vars for managed-node2
19285 1727203934.86222: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203934.86233: Calling all_plugins_play to load vars for managed-node2
19285 1727203934.86237: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203934.86240: Calling groups_plugins_play to load vars for managed-node2
19285 1727203934.90291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203934.93457: done with get_vars()
19285 1727203934.93690: done getting variables
19285 1727203934.93748: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024  14:52:14 -0400 (0:00:00.147)       0:00:34.012 *****
19285 1727203934.93785: entering _queue_task() for managed-node2/fail
19285 1727203934.94538: worker is 1 (out of 1 available)
19285 1727203934.94551: exiting _queue_task() for managed-node2/fail
19285 1727203934.94567: done queuing things up, now waiting for results queue to drain
19285 1727203934.94568: waiting for pending results...
19285 1727203934.95217: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19285 1727203934.95238: in run() - task 028d2410-947f-f31b-fb3f-000000000061
19285 1727203934.95252: variable 'ansible_search_path' from source: unknown
19285 1727203934.95257: variable 'ansible_search_path' from source: unknown
19285 1727203934.95400: calling self._execute()
19285 1727203934.95530: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203934.95535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203934.95538: variable 'omit' from source: magic vars
19285 1727203934.96605: variable 'ansible_distribution_major_version' from source: facts
19285 1727203934.96616: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203934.96733: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203934.97164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203934.99942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203935.00009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203935.00045: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203935.00080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203935.00216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203935.00443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.00526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.00530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.00556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.00571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.00837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.00862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.01080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.01083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.01086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.01089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.01092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.01096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.01281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.01298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.01495: variable 'network_connections' from source: play vars
19285 1727203935.01498: variable 'profile' from source: play vars
19285 1727203935.01559: variable 'profile' from source: play vars
19285 1727203935.01642: variable 'interface' from source: set_fact
19285 1727203935.01704: variable 'interface' from source: set_fact
19285 1727203935.01772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203935.02003: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203935.02045: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203935.02084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203935.02113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203935.02153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203935.02178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203935.02372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.02377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203935.02457: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203935.02984: variable 'network_connections' from source: play vars
19285 1727203935.03403: variable 'profile' from source: play vars
19285 1727203935.03782: variable 'profile' from source: play vars
19285 1727203935.03785: variable 'interface' from source: set_fact
19285 1727203935.03788: variable 'interface' from source: set_fact
19285 1727203935.03790: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19285 1727203935.03792: when evaluation is False, skipping this task
19285 1727203935.03794: _execute() done
19285 1727203935.03887: dumping result to json
19285 1727203935.03899: done dumping result, returning
19285 1727203935.03916: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000061]
19285 1727203935.03934: sending task result for task 028d2410-947f-f31b-fb3f-000000000061
19285 1727203935.04200: done sending task result for task 028d2410-947f-f31b-fb3f-000000000061
19285 1727203935.04204: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
19285 1727203935.04278: no more pending results, returning what we have
19285 1727203935.04283: results queue empty
19285 1727203935.04284: checking for any_errors_fatal
19285 1727203935.04292: done checking for any_errors_fatal
19285 1727203935.04292: checking for max_fail_percentage
19285 1727203935.04294: done checking for max_fail_percentage
19285 1727203935.04295: checking to see if all hosts have failed and the running result is not ok
19285 1727203935.04296: done checking to see if all hosts have failed
19285 1727203935.04297: getting the remaining hosts for this loop
19285 1727203935.04299: done getting the remaining hosts for this loop
19285 1727203935.04303: getting the next task for host managed-node2
19285 1727203935.04311: done getting next task for host managed-node2
19285 1727203935.04316: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
19285 1727203935.04318: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203935.04334: getting variables
19285 1727203935.04336: in VariableManager get_vars()
19285 1727203935.04589: Calling all_inventory to load vars for managed-node2
19285 1727203935.04592: Calling groups_inventory to load vars for managed-node2
19285 1727203935.04595: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203935.04607: Calling all_plugins_play to load vars for managed-node2
19285 1727203935.04610: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203935.04614: Calling groups_plugins_play to load vars for managed-node2
19285 1727203935.09017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203935.12464: done with get_vars()
19285 1727203935.12503: done getting variables
19285 1727203935.12565: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024  14:52:15 -0400 (0:00:00.188)       0:00:34.201 *****
19285 1727203935.12667: entering _queue_task() for managed-node2/package
19285 1727203935.13469: worker is 1 (out of 1 available)
19285 1727203935.13483: exiting _queue_task() for managed-node2/package
19285 1727203935.13495: done queuing things up, now waiting for results queue to drain
19285 1727203935.13496: waiting for pending results...
19285 1727203935.13995: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages
19285 1727203935.14000: in run() - task 028d2410-947f-f31b-fb3f-000000000062
19285 1727203935.14003: variable 'ansible_search_path' from source: unknown
19285 1727203935.14005: variable 'ansible_search_path' from source: unknown
19285 1727203935.14008: calling self._execute()
19285 1727203935.14039: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203935.14056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203935.14074: variable 'omit' from source: magic vars
19285 1727203935.14844: variable 'ansible_distribution_major_version' from source: facts
19285 1727203935.14896: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203935.15432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19285 1727203935.15597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19285 1727203935.15648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19285 1727203935.15690: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19285 1727203935.15771: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19285 1727203935.15894: variable 'network_packages' from source: role '' defaults
19285 1727203935.16007: variable '__network_provider_setup' from source: role '' defaults
19285 1727203935.16023: variable '__network_service_name_default_nm' from source: role '' defaults
19285 1727203935.16098: variable '__network_service_name_default_nm' from source: role '' defaults
19285 1727203935.16115: variable '__network_packages_default_nm' from source: role '' defaults
19285 1727203935.16194: variable '__network_packages_default_nm' from source: role '' defaults
19285 1727203935.16387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19285 1727203935.18807: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19285 1727203935.18880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19285 1727203935.18934: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19285 1727203935.18974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19285 1727203935.19021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19285 1727203935.19116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.19157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.19190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.19244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.19264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.19315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.19347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.19378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.19423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.19445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.19992: variable '__network_packages_default_gobject_packages' from source: role '' defaults
19285 1727203935.20028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.20063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.20280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.20283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.20286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.20480: variable 'ansible_python' from source: facts
19285 1727203935.20544: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
19285 1727203935.20724: variable '__network_wpa_supplicant_required' from source: role '' defaults
19285 1727203935.20881: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
19285 1727203935.21581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.21586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.21589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.21813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.21816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.21822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19285 1727203935.22080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19285 1727203935.22084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.22354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19285 1727203935.22357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19285 1727203935.22981: variable 'network_connections' from source: play vars
19285 1727203935.22984: variable 'profile' from source: play vars
19285 1727203935.23156: variable 'profile' from source: play vars
19285 1727203935.23234: variable 'interface' from source: set_fact
19285 1727203935.23313: variable 'interface' from source: set_fact
19285 1727203935.23666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19285 1727203935.23799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19285 1727203935.23834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19285 1727203935.24182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19285 1727203935.24186: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203935.24995: variable 'network_connections' from source: play vars
19285 1727203935.25058: variable 'profile' from source: play vars
19285 1727203935.25382: variable 'profile' from source: play vars
19285 1727203935.25394: variable 'interface' from source: set_fact
19285 1727203935.25466: variable 'interface' from source: set_fact
19285 1727203935.25625: variable '__network_packages_default_wireless' from source: role '' defaults
19285 1727203935.25747: variable '__network_wireless_connections_defined' from source: role '' defaults
19285 1727203935.26462: variable 'network_connections' from source: play vars
19285 1727203935.26571: variable 'profile' from source: play vars
19285 1727203935.26644: variable 'profile' from source: play vars
19285 1727203935.26656: variable 'interface' from source: set_fact
19285 1727203935.26804: variable 'interface' from source: set_fact
19285 1727203935.26837: variable '__network_packages_default_team' from source: role '' defaults
19285 1727203935.26933: variable '__network_team_connections_defined' from source: role '' defaults
19285 1727203935.27273: variable 'network_connections' from source: play vars
19285 1727203935.27291: variable 'profile' from source: play vars
19285 1727203935.27368: variable 'profile' from source: play vars
19285 1727203935.27386: variable 'interface' from source: set_fact
19285 1727203935.27497: variable 'interface' from source: set_fact
19285 1727203935.27574: variable '__network_service_name_default_initscripts' from source: role '' defaults
19285 1727203935.27642: variable '__network_service_name_default_initscripts' from source: role '' defaults
19285 1727203935.27663: variable '__network_packages_default_initscripts' from source: role '' defaults
19285 1727203935.27733: variable '__network_packages_default_initscripts' from source: role '' defaults
19285 1727203935.27986: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
19285 1727203935.28530: variable 'network_connections' from source: play vars
19285 1727203935.28543: variable 'profile' from source: play vars
19285 1727203935.28607: variable 'profile' from source: play vars
19285 1727203935.28617: variable 'interface' from source: set_fact
19285 1727203935.28693: variable 'interface' from source: set_fact
19285 1727203935.28707: variable 'ansible_distribution' from source: facts
19285 1727203935.28718: variable '__network_rh_distros' from source: role '' defaults
19285 1727203935.28736: variable 'ansible_distribution_major_version' from source: facts
19285 1727203935.28763: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
19285 1727203935.28943: variable 'ansible_distribution' from source: facts
19285 1727203935.28951: variable '__network_rh_distros' from source: role '' defaults
19285 1727203935.28965: variable 'ansible_distribution_major_version' from source: facts
19285 1727203935.28985: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
19285 1727203935.29179: variable 'ansible_distribution' from source: facts
19285 1727203935.29182: variable '__network_rh_distros' from source: role '' defaults
19285 1727203935.29184: variable 'ansible_distribution_major_version' from source: facts
19285 1727203935.29209: variable 'network_provider' from source: set_fact
19285 1727203935.29229: variable 'ansible_facts' from source: unknown
19285 1727203935.29933: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
19285
1727203935.29936: when evaluation is False, skipping this task 19285 1727203935.29938: _execute() done 19285 1727203935.29940: dumping result to json 19285 1727203935.29941: done dumping result, returning 19285 1727203935.29944: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-f31b-fb3f-000000000062] 19285 1727203935.29945: sending task result for task 028d2410-947f-f31b-fb3f-000000000062 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19285 1727203935.30097: no more pending results, returning what we have 19285 1727203935.30102: results queue empty 19285 1727203935.30103: checking for any_errors_fatal 19285 1727203935.30113: done checking for any_errors_fatal 19285 1727203935.30114: checking for max_fail_percentage 19285 1727203935.30116: done checking for max_fail_percentage 19285 1727203935.30117: checking to see if all hosts have failed and the running result is not ok 19285 1727203935.30118: done checking to see if all hosts have failed 19285 1727203935.30119: getting the remaining hosts for this loop 19285 1727203935.30120: done getting the remaining hosts for this loop 19285 1727203935.30124: getting the next task for host managed-node2 19285 1727203935.30132: done getting next task for host managed-node2 19285 1727203935.30136: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19285 1727203935.30139: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203935.30153: getting variables 19285 1727203935.30155: in VariableManager get_vars() 19285 1727203935.30196: Calling all_inventory to load vars for managed-node2 19285 1727203935.30200: Calling groups_inventory to load vars for managed-node2 19285 1727203935.30202: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203935.30215: Calling all_plugins_play to load vars for managed-node2 19285 1727203935.30224: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203935.30227: Calling groups_plugins_play to load vars for managed-node2 19285 1727203935.31189: done sending task result for task 028d2410-947f-f31b-fb3f-000000000062 19285 1727203935.31192: WORKER PROCESS EXITING 19285 1727203935.32125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203935.33685: done with get_vars() 19285 1727203935.33712: done getting variables 19285 1727203935.33786: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:52:15 -0400 (0:00:00.211) 0:00:34.412 ***** 19285 1727203935.33818: entering _queue_task() for managed-node2/package 19285 1727203935.34199: worker is 1 (out of 1 available) 19285 1727203935.34215: exiting _queue_task() for managed-node2/package 19285 1727203935.34228: done queuing things up, now waiting for results queue to drain 19285 1727203935.34230: waiting for pending results... 
19285 1727203935.34518: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19285 1727203935.34630: in run() - task 028d2410-947f-f31b-fb3f-000000000063 19285 1727203935.34649: variable 'ansible_search_path' from source: unknown 19285 1727203935.34658: variable 'ansible_search_path' from source: unknown 19285 1727203935.34705: calling self._execute() 19285 1727203935.34829: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.34849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.34866: variable 'omit' from source: magic vars 19285 1727203935.35290: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.35308: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203935.35434: variable 'network_state' from source: role '' defaults 19285 1727203935.35449: Evaluated conditional (network_state != {}): False 19285 1727203935.35463: when evaluation is False, skipping this task 19285 1727203935.35472: _execute() done 19285 1727203935.35482: dumping result to json 19285 1727203935.35489: done dumping result, returning 19285 1727203935.35501: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-f31b-fb3f-000000000063] 19285 1727203935.35512: sending task result for task 028d2410-947f-f31b-fb3f-000000000063 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203935.35674: no more pending results, returning what we have 19285 1727203935.35680: results queue empty 19285 1727203935.35681: checking for any_errors_fatal 19285 1727203935.35691: done checking for any_errors_fatal 19285 1727203935.35691: checking for max_fail_percentage 19285 
1727203935.35693: done checking for max_fail_percentage 19285 1727203935.35694: checking to see if all hosts have failed and the running result is not ok 19285 1727203935.35695: done checking to see if all hosts have failed 19285 1727203935.35696: getting the remaining hosts for this loop 19285 1727203935.35698: done getting the remaining hosts for this loop 19285 1727203935.35703: getting the next task for host managed-node2 19285 1727203935.35711: done getting next task for host managed-node2 19285 1727203935.35715: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19285 1727203935.35717: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203935.35734: getting variables 19285 1727203935.35736: in VariableManager get_vars() 19285 1727203935.35879: Calling all_inventory to load vars for managed-node2 19285 1727203935.35887: Calling groups_inventory to load vars for managed-node2 19285 1727203935.35891: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203935.36085: Calling all_plugins_play to load vars for managed-node2 19285 1727203935.36089: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203935.36093: Calling groups_plugins_play to load vars for managed-node2 19285 1727203935.36791: done sending task result for task 028d2410-947f-f31b-fb3f-000000000063 19285 1727203935.36795: WORKER PROCESS EXITING 19285 1727203935.37546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203935.39163: done with get_vars() 19285 1727203935.39193: done getting variables 19285 1727203935.39258: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:52:15 -0400 (0:00:00.054) 0:00:34.467 ***** 19285 1727203935.39297: entering _queue_task() for managed-node2/package 19285 1727203935.39647: worker is 1 (out of 1 available) 19285 1727203935.39660: exiting _queue_task() for managed-node2/package 19285 1727203935.39672: done queuing things up, now waiting for results queue to drain 19285 1727203935.39673: waiting for pending results... 19285 1727203935.40042: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19285 1727203935.40165: in run() - task 028d2410-947f-f31b-fb3f-000000000064 19285 1727203935.40197: variable 'ansible_search_path' from source: unknown 19285 1727203935.40207: variable 'ansible_search_path' from source: unknown 19285 1727203935.40247: calling self._execute() 19285 1727203935.40360: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.40372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.40393: variable 'omit' from source: magic vars 19285 1727203935.40778: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.40802: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203935.40943: variable 'network_state' from source: role '' defaults 19285 1727203935.40958: Evaluated conditional (network_state != {}): False 19285 1727203935.40966: when evaluation is False, 
skipping this task 19285 1727203935.40977: _execute() done 19285 1727203935.40990: dumping result to json 19285 1727203935.41000: done dumping result, returning 19285 1727203935.41011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-f31b-fb3f-000000000064] 19285 1727203935.41021: sending task result for task 028d2410-947f-f31b-fb3f-000000000064 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203935.41191: no more pending results, returning what we have 19285 1727203935.41196: results queue empty 19285 1727203935.41197: checking for any_errors_fatal 19285 1727203935.41207: done checking for any_errors_fatal 19285 1727203935.41208: checking for max_fail_percentage 19285 1727203935.41210: done checking for max_fail_percentage 19285 1727203935.41211: checking to see if all hosts have failed and the running result is not ok 19285 1727203935.41212: done checking to see if all hosts have failed 19285 1727203935.41213: getting the remaining hosts for this loop 19285 1727203935.41215: done getting the remaining hosts for this loop 19285 1727203935.41219: getting the next task for host managed-node2 19285 1727203935.41226: done getting next task for host managed-node2 19285 1727203935.41230: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203935.41233: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203935.41248: getting variables 19285 1727203935.41249: in VariableManager get_vars() 19285 1727203935.41295: Calling all_inventory to load vars for managed-node2 19285 1727203935.41299: Calling groups_inventory to load vars for managed-node2 19285 1727203935.41302: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203935.41315: Calling all_plugins_play to load vars for managed-node2 19285 1727203935.41318: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203935.41322: Calling groups_plugins_play to load vars for managed-node2 19285 1727203935.42148: done sending task result for task 028d2410-947f-f31b-fb3f-000000000064 19285 1727203935.42151: WORKER PROCESS EXITING 19285 1727203935.43028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203935.44699: done with get_vars() 19285 1727203935.44722: done getting variables 19285 1727203935.44790: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:52:15 -0400 (0:00:00.055) 0:00:34.522 ***** 19285 1727203935.44827: entering _queue_task() for managed-node2/service 19285 1727203935.45478: worker is 1 (out of 1 available) 19285 1727203935.45488: exiting _queue_task() for managed-node2/service 19285 1727203935.45498: done queuing things up, now waiting for results queue to drain 19285 1727203935.45499: waiting for pending results... 
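Both nmstate-related package tasks above share the same gate, `network_state != {}`. The log shows `network_state` resolving from the role's defaults ("from source: role '' defaults"), where it is an empty dict, so the gate is False in this run:

```python
# Sketch of the shared gate on the two nmstate install tasks.
# The role default for network_state is an empty dict (assumption based
# on the logged source "role '' defaults" and the False evaluation).
network_state = {}

run_nmstate_install = network_state != {}
print(run_nmstate_install)  # False -> both tasks skip
```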
19285 1727203935.45867: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19285 1727203935.46198: in run() - task 028d2410-947f-f31b-fb3f-000000000065 19285 1727203935.46278: variable 'ansible_search_path' from source: unknown 19285 1727203935.46288: variable 'ansible_search_path' from source: unknown 19285 1727203935.46339: calling self._execute() 19285 1727203935.46641: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.46659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.46699: variable 'omit' from source: magic vars 19285 1727203935.47520: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.47702: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203935.47802: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203935.48093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203935.53012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203935.53384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203935.53389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203935.53391: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203935.53393: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203935.53541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 19285 1727203935.53648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.53880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.53884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.53894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.53951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.54064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.54098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.54190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.54274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.54323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.54391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.54500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.54546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.54594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.55017: variable 'network_connections' from source: play vars 19285 1727203935.55036: variable 'profile' from source: play vars 19285 1727203935.55199: variable 'profile' from source: play vars 19285 1727203935.55239: variable 'interface' from source: set_fact 19285 1727203935.55341: variable 'interface' from source: set_fact 19285 1727203935.55526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203935.55855: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203935.56081: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203935.56084: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203935.56216: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203935.56236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203935.56264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203935.56309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.56480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203935.56522: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203935.57164: variable 'network_connections' from source: play vars 19285 1727203935.57199: variable 'profile' from source: play vars 19285 1727203935.57347: variable 'profile' from source: play vars 19285 1727203935.57357: variable 'interface' from source: set_fact 19285 1727203935.57582: variable 'interface' from source: set_fact 19285 1727203935.57586: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19285 1727203935.57588: when evaluation is False, skipping this task 19285 1727203935.57590: _execute() done 19285 1727203935.57592: dumping result to json 19285 1727203935.57594: done dumping result, returning 19285 1727203935.57595: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-f31b-fb3f-000000000065] 19285 1727203935.57604: sending task result for task 028d2410-947f-f31b-fb3f-000000000065 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19285 1727203935.57927: no more pending results, returning what we have 19285 1727203935.57931: results queue empty 19285 1727203935.57932: checking for any_errors_fatal 19285 1727203935.57940: done checking for any_errors_fatal 19285 1727203935.57941: checking for max_fail_percentage 19285 1727203935.57943: done checking for max_fail_percentage 19285 1727203935.57944: checking to see if all hosts have failed and the running result is not ok 19285 1727203935.57945: done checking to see if all hosts have failed 19285 1727203935.57946: getting the remaining hosts for this loop 19285 1727203935.57948: done getting the remaining hosts for this loop 19285 1727203935.57951: getting the next task for host managed-node2 19285 1727203935.57959: done getting next task for host managed-node2 19285 1727203935.57963: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203935.57965: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203935.57982: getting variables 19285 1727203935.57984: in VariableManager get_vars() 19285 1727203935.58026: Calling all_inventory to load vars for managed-node2 19285 1727203935.58029: Calling groups_inventory to load vars for managed-node2 19285 1727203935.58032: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203935.58043: Calling all_plugins_play to load vars for managed-node2 19285 1727203935.58046: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203935.58049: Calling groups_plugins_play to load vars for managed-node2 19285 1727203935.59004: done sending task result for task 028d2410-947f-f31b-fb3f-000000000065 19285 1727203935.59007: WORKER PROCESS EXITING 19285 1727203935.61666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203935.64863: done with get_vars() 19285 1727203935.65108: done getting variables 19285 1727203935.65171: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:52:15 -0400 (0:00:00.203) 0:00:34.726 ***** 19285 1727203935.65206: entering _queue_task() for managed-node2/service 19285 1727203935.65969: worker is 1 (out of 1 available) 19285 1727203935.65989: exiting _queue_task() for managed-node2/service 19285 1727203935.66003: done queuing things up, now waiting for results queue to drain 19285 1727203935.66005: waiting for pending results... 
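The "Enable and start NetworkManager" task that follows is gated differently from the skipped tasks above: its condition is a disjunction, so it runs whenever the provider is NetworkManager, even with an empty `network_state`. A minimal sketch, assuming `network_provider` was set to `"nm"` by the earlier `set_fact` (consistent with the True evaluation logged below):

```python
# Sketch of the gate on "Enable and start NetworkManager".
network_provider = "nm"  # from set_fact per the log (assumed value)
network_state = {}       # role default, as in the earlier evaluations

enable_nm = network_provider == "nm" or network_state != {}
print(enable_nm)  # True -> this task proceeds instead of skipping
```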
19285 1727203935.67184: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19285 1727203935.68216: in run() - task 028d2410-947f-f31b-fb3f-000000000066 19285 1727203935.68382: variable 'ansible_search_path' from source: unknown 19285 1727203935.68408: variable 'ansible_search_path' from source: unknown 19285 1727203935.68568: calling self._execute() 19285 1727203935.68802: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.69014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.69018: variable 'omit' from source: magic vars 19285 1727203935.70783: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.70788: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203935.71250: variable 'network_provider' from source: set_fact 19285 1727203935.71254: variable 'network_state' from source: role '' defaults 19285 1727203935.71257: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19285 1727203935.71263: variable 'omit' from source: magic vars 19285 1727203935.71266: variable 'omit' from source: magic vars 19285 1727203935.71282: variable 'network_service_name' from source: role '' defaults 19285 1727203935.71353: variable 'network_service_name' from source: role '' defaults 19285 1727203935.71833: variable '__network_provider_setup' from source: role '' defaults 19285 1727203935.71941: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203935.72081: variable '__network_service_name_default_nm' from source: role '' defaults 19285 1727203935.72085: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203935.72140: variable '__network_packages_default_nm' from source: role '' defaults 19285 1727203935.72928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 19285 1727203935.79850: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203935.80125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203935.80172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203935.80394: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203935.80426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203935.80737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.80799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.81180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.81184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.81186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.81220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19285 1727203935.81332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.81357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.81570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.81593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.82345: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19285 1727203935.82784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.82811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.82838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.82970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.82992: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.83131: variable 'ansible_python' from source: facts 19285 1727203935.83283: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19285 1727203935.83580: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203935.83583: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203935.83777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.84082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.84085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.84088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.84091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.84132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203935.84173: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203935.84318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.84364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203935.84394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203935.84630: variable 'network_connections' from source: play vars 19285 1727203935.84691: variable 'profile' from source: play vars 19285 1727203935.84820: variable 'profile' from source: play vars 19285 1727203935.84831: variable 'interface' from source: set_fact 19285 1727203935.85011: variable 'interface' from source: set_fact 19285 1727203935.85213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203935.85616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203935.85674: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203935.85802: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203935.85870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203935.85991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203935.86113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203935.86194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203935.86233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203935.86329: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203935.86968: variable 'network_connections' from source: play vars 19285 1727203935.87020: variable 'profile' from source: play vars 19285 1727203935.87284: variable 'profile' from source: play vars 19285 1727203935.87287: variable 'interface' from source: set_fact 19285 1727203935.87330: variable 'interface' from source: set_fact 19285 1727203935.87371: variable '__network_packages_default_wireless' from source: role '' defaults 19285 1727203935.87585: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203935.88149: variable 'network_connections' from source: play vars 19285 1727203935.88280: variable 'profile' from source: play vars 19285 1727203935.88484: variable 'profile' from source: play vars 19285 1727203935.88488: variable 'interface' from source: set_fact 19285 1727203935.88558: variable 'interface' from source: set_fact 19285 1727203935.88620: variable '__network_packages_default_team' from source: role '' defaults 19285 1727203935.88773: variable '__network_team_connections_defined' from source: role '' defaults 19285 1727203935.89432: variable 
'network_connections' from source: play vars 19285 1727203935.89490: variable 'profile' from source: play vars 19285 1727203935.89571: variable 'profile' from source: play vars 19285 1727203935.89636: variable 'interface' from source: set_fact 19285 1727203935.89813: variable 'interface' from source: set_fact 19285 1727203935.90067: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203935.90069: variable '__network_service_name_default_initscripts' from source: role '' defaults 19285 1727203935.90072: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203935.90235: variable '__network_packages_default_initscripts' from source: role '' defaults 19285 1727203935.90660: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19285 1727203935.91718: variable 'network_connections' from source: play vars 19285 1727203935.91923: variable 'profile' from source: play vars 19285 1727203935.91926: variable 'profile' from source: play vars 19285 1727203935.91928: variable 'interface' from source: set_fact 19285 1727203935.92009: variable 'interface' from source: set_fact 19285 1727203935.92181: variable 'ansible_distribution' from source: facts 19285 1727203935.92184: variable '__network_rh_distros' from source: role '' defaults 19285 1727203935.92186: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.92188: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19285 1727203935.92444: variable 'ansible_distribution' from source: facts 19285 1727203935.92791: variable '__network_rh_distros' from source: role '' defaults 19285 1727203935.92794: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.92797: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19285 1727203935.93079: variable 'ansible_distribution' from source: 
facts 19285 1727203935.93491: variable '__network_rh_distros' from source: role '' defaults 19285 1727203935.93494: variable 'ansible_distribution_major_version' from source: facts 19285 1727203935.93496: variable 'network_provider' from source: set_fact 19285 1727203935.93499: variable 'omit' from source: magic vars 19285 1727203935.93503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203935.93535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203935.93623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203935.93925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203935.93929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203935.93932: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203935.93934: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.93936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.94114: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203935.94316: Set connection var ansible_pipelining to False 19285 1727203935.94327: Set connection var ansible_timeout to 10 19285 1727203935.94334: Set connection var ansible_shell_type to sh 19285 1727203935.94346: Set connection var ansible_shell_executable to /bin/sh 19285 1727203935.94367: Set connection var ansible_connection to ssh 19285 1727203935.94402: variable 'ansible_shell_executable' from source: unknown 19285 1727203935.94477: variable 'ansible_connection' from source: unknown 19285 1727203935.94486: variable 'ansible_module_compression' from source: unknown 19285 1727203935.94499: 
variable 'ansible_shell_type' from source: unknown 19285 1727203935.94507: variable 'ansible_shell_executable' from source: unknown 19285 1727203935.94542: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203935.94585: variable 'ansible_pipelining' from source: unknown 19285 1727203935.94593: variable 'ansible_timeout' from source: unknown 19285 1727203935.94602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203935.95009: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203935.95012: variable 'omit' from source: magic vars 19285 1727203935.95015: starting attempt loop 19285 1727203935.95017: running the handler 19285 1727203935.95119: variable 'ansible_facts' from source: unknown 19285 1727203935.96834: _low_level_execute_command(): starting 19285 1727203935.96982: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203935.98471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203935.98489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203935.98521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203935.98632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203935.98695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203935.98857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203935.98919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203935.99182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.00971: stdout chunk (state=3): >>>/root <<< 19285 1727203936.01038: stdout chunk (state=3): >>><<< 19285 1727203936.01047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203936.01063: stderr chunk (state=3): >>><<< 19285 1727203936.01282: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203936.01286: _low_level_execute_command(): starting 19285 1727203936.01289: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391 `" && echo ansible-tmp-1727203936.011986-22108-245261927264391="` echo /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391 `" ) && sleep 0' 19285 1727203936.02207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203936.02491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203936.02560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203936.02582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 
1727203936.02787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203936.03053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.04895: stdout chunk (state=3): >>>ansible-tmp-1727203936.011986-22108-245261927264391=/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391 <<< 19285 1727203936.05087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203936.05141: stdout chunk (state=3): >>><<< 19285 1727203936.05145: stderr chunk (state=3): >>><<< 19285 1727203936.05203: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203936.011986-22108-245261927264391=/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203936.05269: 
variable 'ansible_module_compression' from source: unknown 19285 1727203936.05511: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19285 1727203936.05549: variable 'ansible_facts' from source: unknown 19285 1727203936.06127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py 19285 1727203936.07015: Sending initial data 19285 1727203936.07019: Sent initial data (155 bytes) 19285 1727203936.07969: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203936.08018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203936.08034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203936.08092: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203936.08169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203936.08192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203936.08206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 
1727203936.08312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.09972: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203936.10039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203936.10169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpqdeb1dko /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py <<< 19285 1727203936.10172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py" <<< 19285 1727203936.10259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpqdeb1dko" to remote "/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py" <<< 19285 1727203936.13115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203936.13439: stderr chunk (state=3): >>><<< 19285 1727203936.13444: stdout chunk (state=3): >>><<< 19285 1727203936.13446: done transferring module to remote 19285 1727203936.13449: _low_level_execute_command(): starting 19285 1727203936.13453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/ /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py && sleep 0' 19285 1727203936.14580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203936.14795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203936.14897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.16953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203936.16957: stdout chunk (state=3): >>><<< 19285 1727203936.16960: stderr chunk (state=3): >>><<< 19285 1727203936.16978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203936.16988: _low_level_execute_command(): starting 19285 1727203936.16998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/AnsiballZ_systemd.py && sleep 0' 19285 1727203936.18269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203936.18286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203936.18308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203936.18578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.47471: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", 
"NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4448256", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297464320", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "637249000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 19285 
1727203936.47493: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19285 1727203936.49464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203936.49467: stdout chunk (state=3): >>><<< 19285 1727203936.49469: stderr chunk (state=3): >>><<< 19285 1727203936.49682: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": 
"294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4448256", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297464320", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "637249000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": 
"[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target cloud-init.service multi-user.target NetworkManager-wait-online.service shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target network-pre.target system.slice cloud-init-local.service dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:13 EDT", "StateChangeTimestampMonotonic": "399463156", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", 
"InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203936.49692: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203936.49993: _low_level_execute_command(): starting 19285 1727203936.50348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203936.011986-22108-245261927264391/ > /dev/null 2>&1 && sleep 0' 19285 1727203936.51833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203936.51837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203936.51842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203936.51844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203936.51892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203936.52204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203936.52467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203936.54378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203936.54382: stdout chunk (state=3): >>><<< 19285 1727203936.54581: stderr chunk (state=3): >>><<< 19285 1727203936.54585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203936.54587: handler run complete 19285 1727203936.54590: attempt loop complete, returning result 19285 1727203936.54592: _execute() done 19285 1727203936.54594: dumping result to json 19285 1727203936.54596: done dumping result, returning 19285 1727203936.54598: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-f31b-fb3f-000000000066] 19285 1727203936.54600: sending task result for task 028d2410-947f-f31b-fb3f-000000000066 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203936.55295: no more pending results, returning what we have 19285 1727203936.55299: results queue empty 19285 1727203936.55300: checking for any_errors_fatal 19285 1727203936.55313: done checking for any_errors_fatal 19285 1727203936.55314: checking for max_fail_percentage 19285 1727203936.55315: done checking for max_fail_percentage 19285 1727203936.55316: checking to see if all hosts have failed and the running result is not ok 19285 1727203936.55317: done checking to see if all hosts have failed 19285 1727203936.55318: getting the remaining hosts for this loop 19285 1727203936.55320: done getting the remaining hosts for this loop 19285 1727203936.55323: getting the next task for host managed-node2 19285 
1727203936.55330: done getting next task for host managed-node2 19285 1727203936.55334: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19285 1727203936.55336: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203936.55346: getting variables 19285 1727203936.55347: in VariableManager get_vars() 19285 1727203936.55420: Calling all_inventory to load vars for managed-node2 19285 1727203936.55423: Calling groups_inventory to load vars for managed-node2 19285 1727203936.55426: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203936.55436: Calling all_plugins_play to load vars for managed-node2 19285 1727203936.55439: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203936.55442: Calling groups_plugins_play to load vars for managed-node2 19285 1727203936.56190: done sending task result for task 028d2410-947f-f31b-fb3f-000000000066 19285 1727203936.56194: WORKER PROCESS EXITING 19285 1727203936.59819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203936.63205: done with get_vars() 19285 1727203936.63350: done getting variables 19285 1727203936.63478: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 
14:52:16 -0400 (0:00:00.983) 0:00:35.709 ***** 19285 1727203936.63510: entering _queue_task() for managed-node2/service 19285 1727203936.64444: worker is 1 (out of 1 available) 19285 1727203936.64456: exiting _queue_task() for managed-node2/service 19285 1727203936.64469: done queuing things up, now waiting for results queue to drain 19285 1727203936.64470: waiting for pending results... 19285 1727203936.64852: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19285 1727203936.65081: in run() - task 028d2410-947f-f31b-fb3f-000000000067 19285 1727203936.65295: variable 'ansible_search_path' from source: unknown 19285 1727203936.65299: variable 'ansible_search_path' from source: unknown 19285 1727203936.65301: calling self._execute() 19285 1727203936.65466: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203936.65481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203936.65527: variable 'omit' from source: magic vars 19285 1727203936.66373: variable 'ansible_distribution_major_version' from source: facts 19285 1727203936.66395: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203936.66708: variable 'network_provider' from source: set_fact 19285 1727203936.66712: Evaluated conditional (network_provider == "nm"): True 19285 1727203936.66850: variable '__network_wpa_supplicant_required' from source: role '' defaults 19285 1727203936.67071: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19285 1727203936.67469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203936.73138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203936.73275: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203936.73582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203936.73622: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203936.73856: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203936.73997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203936.74157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203936.74256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203936.74383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203936.74628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203936.74662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203936.74755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203936.74837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203936.74918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203936.74941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203936.75109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203936.75113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203936.75192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203936.75320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203936.75340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203936.75671: variable 'network_connections' from source: play vars 
19285 1727203936.75729: variable 'profile' from source: play vars 19285 1727203936.75896: variable 'profile' from source: play vars 19285 1727203936.75934: variable 'interface' from source: set_fact 19285 1727203936.76182: variable 'interface' from source: set_fact 19285 1727203936.76333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19285 1727203936.76849: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19285 1727203936.77233: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19285 1727203936.77247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19285 1727203936.77402: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19285 1727203936.77519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19285 1727203936.77574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19285 1727203936.77687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203936.77780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19285 1727203936.77883: variable '__network_wireless_connections_defined' from source: role '' defaults 19285 1727203936.78378: variable 'network_connections' from source: play vars 19285 1727203936.78581: variable 'profile' from source: 
play vars 19285 1727203936.78585: variable 'profile' from source: play vars 19285 1727203936.78587: variable 'interface' from source: set_fact 19285 1727203936.78723: variable 'interface' from source: set_fact 19285 1727203936.78811: Evaluated conditional (__network_wpa_supplicant_required): False 19285 1727203936.79050: when evaluation is False, skipping this task 19285 1727203936.79055: _execute() done 19285 1727203936.79068: dumping result to json 19285 1727203936.79071: done dumping result, returning 19285 1727203936.79073: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-f31b-fb3f-000000000067] 19285 1727203936.79077: sending task result for task 028d2410-947f-f31b-fb3f-000000000067 19285 1727203936.79143: done sending task result for task 028d2410-947f-f31b-fb3f-000000000067 19285 1727203936.79146: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19285 1727203936.79203: no more pending results, returning what we have 19285 1727203936.79207: results queue empty 19285 1727203936.79208: checking for any_errors_fatal 19285 1727203936.79225: done checking for any_errors_fatal 19285 1727203936.79226: checking for max_fail_percentage 19285 1727203936.79229: done checking for max_fail_percentage 19285 1727203936.79229: checking to see if all hosts have failed and the running result is not ok 19285 1727203936.79230: done checking to see if all hosts have failed 19285 1727203936.79231: getting the remaining hosts for this loop 19285 1727203936.79233: done getting the remaining hosts for this loop 19285 1727203936.79237: getting the next task for host managed-node2 19285 1727203936.79245: done getting next task for host managed-node2 19285 1727203936.79250: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203936.79252: ^ 
state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203936.79269: getting variables 19285 1727203936.79271: in VariableManager get_vars() 19285 1727203936.79315: Calling all_inventory to load vars for managed-node2 19285 1727203936.79318: Calling groups_inventory to load vars for managed-node2 19285 1727203936.79321: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203936.79331: Calling all_plugins_play to load vars for managed-node2 19285 1727203936.79334: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203936.79338: Calling groups_plugins_play to load vars for managed-node2 19285 1727203936.84504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203936.88513: done with get_vars() 19285 1727203936.88547: done getting variables 19285 1727203936.88732: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:52:16 -0400 (0:00:00.252) 0:00:35.962 ***** 19285 1727203936.88766: entering _queue_task() for managed-node2/service 19285 1727203936.89863: worker is 1 (out of 1 available) 19285 1727203936.89874: exiting _queue_task() for managed-node2/service 19285 1727203936.89888: done queuing things up, now waiting for results queue to drain 19285 
1727203936.89889: waiting for pending results... 19285 1727203936.90399: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 19285 1727203936.90741: in run() - task 028d2410-947f-f31b-fb3f-000000000068 19285 1727203936.90745: variable 'ansible_search_path' from source: unknown 19285 1727203936.90748: variable 'ansible_search_path' from source: unknown 19285 1727203936.90750: calling self._execute() 19285 1727203936.90871: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203936.90884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203936.90897: variable 'omit' from source: magic vars 19285 1727203936.91820: variable 'ansible_distribution_major_version' from source: facts 19285 1727203936.91838: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203936.92083: variable 'network_provider' from source: set_fact 19285 1727203936.92202: Evaluated conditional (network_provider == "initscripts"): False 19285 1727203936.92303: when evaluation is False, skipping this task 19285 1727203936.92306: _execute() done 19285 1727203936.92309: dumping result to json 19285 1727203936.92311: done dumping result, returning 19285 1727203936.92313: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-f31b-fb3f-000000000068] 19285 1727203936.92315: sending task result for task 028d2410-947f-f31b-fb3f-000000000068 19285 1727203936.92387: done sending task result for task 028d2410-947f-f31b-fb3f-000000000068 19285 1727203936.92391: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19285 1727203936.92458: no more pending results, returning what we have 19285 1727203936.92463: results queue empty 19285 1727203936.92464: checking for 
any_errors_fatal 19285 1727203936.92479: done checking for any_errors_fatal 19285 1727203936.92480: checking for max_fail_percentage 19285 1727203936.92483: done checking for max_fail_percentage 19285 1727203936.92484: checking to see if all hosts have failed and the running result is not ok 19285 1727203936.92485: done checking to see if all hosts have failed 19285 1727203936.92486: getting the remaining hosts for this loop 19285 1727203936.92488: done getting the remaining hosts for this loop 19285 1727203936.92492: getting the next task for host managed-node2 19285 1727203936.92505: done getting next task for host managed-node2 19285 1727203936.92512: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19285 1727203936.92515: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203936.92530: getting variables 19285 1727203936.92531: in VariableManager get_vars() 19285 1727203936.92570: Calling all_inventory to load vars for managed-node2 19285 1727203936.92573: Calling groups_inventory to load vars for managed-node2 19285 1727203936.92779: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203936.92793: Calling all_plugins_play to load vars for managed-node2 19285 1727203936.92797: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203936.92800: Calling groups_plugins_play to load vars for managed-node2 19285 1727203936.96530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203937.00249: done with get_vars() 19285 1727203937.00281: done getting variables 19285 1727203937.00479: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:52:17 -0400 (0:00:00.117) 0:00:36.079 ***** 19285 1727203937.00511: entering _queue_task() for managed-node2/copy 19285 1727203937.01619: worker is 1 (out of 1 available) 19285 1727203937.01632: exiting _queue_task() for managed-node2/copy 19285 1727203937.01644: done queuing things up, now waiting for results queue to drain 19285 1727203937.01645: waiting for pending results... 
19285 1727203937.02424: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19285 1727203937.02430: in run() - task 028d2410-947f-f31b-fb3f-000000000069 19285 1727203937.02433: variable 'ansible_search_path' from source: unknown 19285 1727203937.02436: variable 'ansible_search_path' from source: unknown 19285 1727203937.02442: calling self._execute() 19285 1727203937.02630: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203937.02639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203937.02798: variable 'omit' from source: magic vars 19285 1727203937.03748: variable 'ansible_distribution_major_version' from source: facts 19285 1727203937.03791: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203937.04093: variable 'network_provider' from source: set_fact 19285 1727203937.04112: Evaluated conditional (network_provider == "initscripts"): False 19285 1727203937.04280: when evaluation is False, skipping this task 19285 1727203937.04287: _execute() done 19285 1727203937.04291: dumping result to json 19285 1727203937.04296: done dumping result, returning 19285 1727203937.04299: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-f31b-fb3f-000000000069] 19285 1727203937.04302: sending task result for task 028d2410-947f-f31b-fb3f-000000000069 19285 1727203937.04647: done sending task result for task 028d2410-947f-f31b-fb3f-000000000069 19285 1727203937.04651: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19285 1727203937.04720: no more pending results, returning what we have 19285 1727203937.04725: results queue empty 19285 1727203937.04729: checking for 
any_errors_fatal 19285 1727203937.04741: done checking for any_errors_fatal 19285 1727203937.04742: checking for max_fail_percentage 19285 1727203937.04745: done checking for max_fail_percentage 19285 1727203937.04746: checking to see if all hosts have failed and the running result is not ok 19285 1727203937.04747: done checking to see if all hosts have failed 19285 1727203937.04747: getting the remaining hosts for this loop 19285 1727203937.04749: done getting the remaining hosts for this loop 19285 1727203937.04753: getting the next task for host managed-node2 19285 1727203937.04766: done getting next task for host managed-node2 19285 1727203937.04771: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19285 1727203937.04777: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203937.04793: getting variables 19285 1727203937.04795: in VariableManager get_vars() 19285 1727203937.04839: Calling all_inventory to load vars for managed-node2 19285 1727203937.04843: Calling groups_inventory to load vars for managed-node2 19285 1727203937.04845: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203937.04863: Calling all_plugins_play to load vars for managed-node2 19285 1727203937.04867: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203937.04870: Calling groups_plugins_play to load vars for managed-node2 19285 1727203937.16887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203937.19223: done with get_vars() 19285 1727203937.19288: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:52:17 -0400 (0:00:00.189) 0:00:36.269 ***** 19285 1727203937.19478: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 19285 1727203937.20379: worker is 1 (out of 1 available) 19285 1727203937.20392: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 19285 1727203937.20405: done queuing things up, now waiting for results queue to drain 19285 1727203937.20406: waiting for pending results... 
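The "Configure networking connection profiles" task that starts next resolves `network_connections` from play vars, with `profile` (play vars) and `interface` (set_fact) templated into it. A hedged sketch of play vars that would produce exactly this resolution order — the actual values are not visible in the log, so every value below is an assumption:

```yaml
# Hypothetical play vars consistent with the variable lookups in the log
# (network_connections -> profile -> interface). Real values are unknown.
vars:
  profile: "{{ interface }}"          # 'interface' itself comes from set_fact
  network_connections:
    - name: "{{ profile }}"
      interface_name: "{{ interface }}"
      state: up
      type: ethernet
```

With vars shaped like this, the role hands `network_connections` to its `network_connections` action/module, which the log then transfers to the remote host as `AnsiballZ_network_connections.py` and executes over the multiplexed SSH connection.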
19285 1727203937.21347: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19285 1727203937.21628: in run() - task 028d2410-947f-f31b-fb3f-00000000006a 19285 1727203937.21761: variable 'ansible_search_path' from source: unknown 19285 1727203937.21765: variable 'ansible_search_path' from source: unknown 19285 1727203937.21882: calling self._execute() 19285 1727203937.22141: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203937.22146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203937.22159: variable 'omit' from source: magic vars 19285 1727203937.22695: variable 'ansible_distribution_major_version' from source: facts 19285 1727203937.22741: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203937.22744: variable 'omit' from source: magic vars 19285 1727203937.22760: variable 'omit' from source: magic vars 19285 1727203937.23011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19285 1727203937.25769: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19285 1727203937.25833: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19285 1727203937.25907: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19285 1727203937.25949: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19285 1727203937.25985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19285 1727203937.26202: variable 'network_provider' from source: set_fact 19285 1727203937.26320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19285 1727203937.26369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19285 1727203937.26406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19285 1727203937.26472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19285 1727203937.26592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19285 1727203937.26601: variable 'omit' from source: magic vars 19285 1727203937.26815: variable 'omit' from source: magic vars 19285 1727203937.26995: variable 'network_connections' from source: play vars 19285 1727203937.27009: variable 'profile' from source: play vars 19285 1727203937.27119: variable 'profile' from source: play vars 19285 1727203937.27123: variable 'interface' from source: set_fact 19285 1727203937.27195: variable 'interface' from source: set_fact 19285 1727203937.27342: variable 'omit' from source: magic vars 19285 1727203937.27356: variable '__lsr_ansible_managed' from source: task vars 19285 1727203937.27471: variable '__lsr_ansible_managed' from source: task vars 19285 1727203937.27805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19285 1727203937.28039: Loaded config def from plugin (lookup/template) 19285 1727203937.28049: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19285 1727203937.28086: File lookup term: get_ansible_managed.j2 19285 1727203937.28089: variable 'ansible_search_path' from source: unknown 19285 1727203937.28092: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19285 1727203937.28108: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19285 1727203937.28145: variable 'ansible_search_path' from source: unknown 19285 1727203937.37108: variable 'ansible_managed' from source: unknown 19285 1727203937.37538: variable 'omit' from source: magic vars 19285 1727203937.37542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203937.37545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203937.37548: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203937.37550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203937.37553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203937.37556: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203937.37558: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203937.37563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203937.37565: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203937.37568: Set connection var ansible_pipelining to False 19285 1727203937.37570: Set connection var ansible_timeout to 10 19285 1727203937.37572: Set connection var ansible_shell_type to sh 19285 1727203937.37574: Set connection var ansible_shell_executable to /bin/sh 19285 1727203937.37581: Set connection var ansible_connection to ssh 19285 1727203937.37584: variable 'ansible_shell_executable' from source: unknown 19285 1727203937.37586: variable 'ansible_connection' from source: unknown 19285 1727203937.37587: variable 'ansible_module_compression' from source: unknown 19285 1727203937.37589: variable 'ansible_shell_type' from source: unknown 19285 1727203937.37591: variable 'ansible_shell_executable' from source: unknown 19285 1727203937.37593: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203937.37595: variable 'ansible_pipelining' from source: unknown 19285 1727203937.37597: variable 'ansible_timeout' from source: unknown 19285 1727203937.37599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203937.37712: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203937.37724: variable 'omit' from source: magic vars 19285 1727203937.37727: starting attempt loop 19285 1727203937.37729: running the handler 19285 1727203937.37740: _low_level_execute_command(): starting 19285 1727203937.37746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203937.38582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203937.38586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203937.38588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203937.38591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203937.38594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203937.38597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203937.38626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.38728: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 19285 1727203937.40438: stdout chunk (state=3): >>>/root <<< 19285 1727203937.40573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203937.40580: stdout chunk (state=3): >>><<< 19285 1727203937.40590: stderr chunk (state=3): >>><<< 19285 1727203937.40611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203937.40623: _low_level_execute_command(): starting 19285 1727203937.40629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414 `" && echo ansible-tmp-1727203937.4061105-22239-63330503465414="` echo /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414 `" ) && sleep 0' 19285 
1727203937.41670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203937.41686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203937.41702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203937.41723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203937.41781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203937.41841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203937.41865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203937.42095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.42181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203937.44228: stdout chunk (state=3): >>>ansible-tmp-1727203937.4061105-22239-63330503465414=/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414 <<< 19285 1727203937.44323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203937.44326: stdout chunk (state=3): >>><<< 19285 1727203937.44329: stderr chunk (state=3): 
>>><<< 19285 1727203937.44348: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203937.4061105-22239-63330503465414=/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203937.44413: variable 'ansible_module_compression' from source: unknown 19285 1727203937.44683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19285 1727203937.44687: variable 'ansible_facts' from source: unknown 19285 1727203937.44910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py 19285 1727203937.45357: Sending initial data 19285 1727203937.45362: Sent initial data (167 
bytes) 19285 1727203937.46455: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203937.46472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203937.46491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203937.46662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203937.46773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.46881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203937.48513: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19285 1727203937.48521: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203937.48582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203937.48651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp63fi6azo /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py <<< 19285 1727203937.48750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py" <<< 19285 1727203937.48804: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp63fi6azo" to remote "/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py" <<< 19285 1727203937.50934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203937.50994: stderr chunk (state=3): >>><<< 19285 1727203937.51004: stdout chunk (state=3): >>><<< 19285 1727203937.51123: done transferring module to remote 19285 1727203937.51173: _low_level_execute_command(): starting 19285 1727203937.51282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/ /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py && sleep 0' 19285 1727203937.52090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 19285 1727203937.52111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203937.52125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203937.52195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203937.52250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203937.52273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203937.52295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.52408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203937.54350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203937.54582: stdout chunk (state=3): >>><<< 19285 1727203937.54585: stderr chunk (state=3): >>><<< 19285 1727203937.54588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203937.54590: _low_level_execute_command(): starting 19285 1727203937.54592: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/AnsiballZ_network_connections.py && sleep 0' 19285 1727203937.55182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203937.55201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203937.55292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203937.55317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203937.55339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203937.55354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.55466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203937.82605: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_k35nrcdx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_k35nrcdx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/a5e85d14-14c9-4d10-940b-6ee660088f46: error=unknown <<< 19285 1727203937.82725: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": 
{"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19285 1727203937.84782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203937.84786: stdout chunk (state=3): >>><<< 19285 1727203937.84788: stderr chunk (state=3): >>><<< 19285 1727203937.84791: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_k35nrcdx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_k35nrcdx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/a5e85d14-14c9-4d10-940b-6ee660088f46: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203937.84793: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203937.84806: _low_level_execute_command(): starting 19285 1727203937.84884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203937.4061105-22239-63330503465414/ > /dev/null 2>&1 && sleep 0' 19285 1727203937.86075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203937.86217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203937.86484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203937.87026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203937.88747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203937.88751: stderr chunk (state=3): >>><<< 19285 1727203937.88753: stdout chunk (state=3): >>><<< 19285 1727203937.88756: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 19285 1727203937.88758: handler run complete 19285 1727203937.88903: attempt loop complete, returning result 19285 1727203937.88907: _execute() done 19285 1727203937.88910: dumping result to json 19285 1727203937.88912: done dumping result, returning 19285 1727203937.88924: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-f31b-fb3f-00000000006a] 19285 1727203937.88929: sending task result for task 028d2410-947f-f31b-fb3f-00000000006a 19285 1727203937.89051: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006a 19285 1727203937.89055: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19285 1727203937.89308: no more pending results, returning what we have 19285 1727203937.89313: results queue empty 19285 1727203937.89314: checking for any_errors_fatal 19285 1727203937.89337: done checking for any_errors_fatal 19285 1727203937.89338: checking for max_fail_percentage 19285 1727203937.89341: done checking for max_fail_percentage 19285 1727203937.89342: checking to see if all hosts have failed and the running result is not ok 19285 1727203937.89343: done checking to see if all hosts have failed 19285 1727203937.89343: getting the remaining hosts for this loop 19285 1727203937.89345: done getting the remaining hosts for this loop 19285 1727203937.89350: getting the next task for host managed-node2 19285 1727203937.89357: done getting next task for host managed-node2 19285 1727203937.89364: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203937.89367: ^ state is: HOST STATE: block=2, task=21, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203937.89420: getting variables 19285 1727203937.89423: in VariableManager get_vars() 19285 1727203937.89563: Calling all_inventory to load vars for managed-node2 19285 1727203937.89567: Calling groups_inventory to load vars for managed-node2 19285 1727203937.89570: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203937.89646: Calling all_plugins_play to load vars for managed-node2 19285 1727203937.89658: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203937.89663: Calling groups_plugins_play to load vars for managed-node2 19285 1727203937.91488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203937.93568: done with get_vars() 19285 1727203937.93592: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:52:17 -0400 (0:00:00.741) 0:00:37.011 ***** 19285 1727203937.93681: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203937.94483: worker is 1 (out of 1 available) 19285 1727203937.94495: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 19285 1727203937.94506: done queuing things up, now waiting for results queue to drain 19285 1727203937.94507: waiting for pending results... 
19285 1727203937.94771: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 19285 1727203937.94954: in run() - task 028d2410-947f-f31b-fb3f-00000000006b 19285 1727203937.95172: variable 'ansible_search_path' from source: unknown 19285 1727203937.95177: variable 'ansible_search_path' from source: unknown 19285 1727203937.95180: calling self._execute() 19285 1727203937.95350: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203937.95400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203937.95415: variable 'omit' from source: magic vars 19285 1727203937.96402: variable 'ansible_distribution_major_version' from source: facts 19285 1727203937.96420: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203937.96646: variable 'network_state' from source: role '' defaults 19285 1727203937.96671: Evaluated conditional (network_state != {}): False 19285 1727203937.96683: when evaluation is False, skipping this task 19285 1727203937.96691: _execute() done 19285 1727203937.96701: dumping result to json 19285 1727203937.96708: done dumping result, returning 19285 1727203937.96718: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-f31b-fb3f-00000000006b] 19285 1727203937.96728: sending task result for task 028d2410-947f-f31b-fb3f-00000000006b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19285 1727203937.96989: no more pending results, returning what we have 19285 1727203937.96994: results queue empty 19285 1727203937.96995: checking for any_errors_fatal 19285 1727203937.97010: done checking for any_errors_fatal 19285 1727203937.97011: checking for max_fail_percentage 19285 1727203937.97013: done checking for max_fail_percentage 19285 1727203937.97014: 
checking to see if all hosts have failed and the running result is not ok 19285 1727203937.97015: done checking to see if all hosts have failed 19285 1727203937.97016: getting the remaining hosts for this loop 19285 1727203937.97026: done getting the remaining hosts for this loop 19285 1727203937.97030: getting the next task for host managed-node2 19285 1727203937.97038: done getting next task for host managed-node2 19285 1727203937.97042: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203937.97045: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203937.97062: getting variables 19285 1727203937.97064: in VariableManager get_vars() 19285 1727203937.97104: Calling all_inventory to load vars for managed-node2 19285 1727203937.97106: Calling groups_inventory to load vars for managed-node2 19285 1727203937.97109: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203937.97121: Calling all_plugins_play to load vars for managed-node2 19285 1727203937.97124: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203937.97246: Calling groups_plugins_play to load vars for managed-node2 19285 1727203937.97264: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006b 19285 1727203937.97268: WORKER PROCESS EXITING 19285 1727203937.99141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203938.01819: done with get_vars() 19285 1727203938.01868: done getting variables 19285 1727203938.01935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:52:18 -0400 (0:00:00.083) 0:00:37.094 ***** 19285 1727203938.01998: entering _queue_task() for managed-node2/debug 19285 1727203938.02509: worker is 1 (out of 1 available) 19285 1727203938.02522: exiting _queue_task() for managed-node2/debug 19285 1727203938.02566: done queuing things up, now waiting for results queue to drain 19285 1727203938.02568: waiting for pending results... 19285 1727203938.03113: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19285 1727203938.03282: in run() - task 028d2410-947f-f31b-fb3f-00000000006c 19285 1727203938.03286: variable 'ansible_search_path' from source: unknown 19285 1727203938.03289: variable 'ansible_search_path' from source: unknown 19285 1727203938.03320: calling self._execute() 19285 1727203938.03532: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.03536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.03539: variable 'omit' from source: magic vars 19285 1727203938.04045: variable 'ansible_distribution_major_version' from source: facts 19285 1727203938.04074: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203938.04128: variable 'omit' from source: magic vars 19285 1727203938.04193: variable 'omit' from source: magic vars 19285 1727203938.04285: variable 'omit' from source: magic vars 19285 1727203938.04290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203938.04334: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203938.04363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203938.04394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.04416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.04449: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203938.04457: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.04502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.04584: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203938.04597: Set connection var ansible_pipelining to False 19285 1727203938.04615: Set connection var ansible_timeout to 10 19285 1727203938.04627: Set connection var ansible_shell_type to sh 19285 1727203938.04640: Set connection var ansible_shell_executable to /bin/sh 19285 1727203938.04680: Set connection var ansible_connection to ssh 19285 1727203938.04684: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.04686: variable 'ansible_connection' from source: unknown 19285 1727203938.04689: variable 'ansible_module_compression' from source: unknown 19285 1727203938.04697: variable 'ansible_shell_type' from source: unknown 19285 1727203938.04704: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.04718: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.04731: variable 'ansible_pipelining' from source: unknown 19285 1727203938.04880: variable 'ansible_timeout' from source: unknown 19285 1727203938.04883: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 19285 1727203938.04901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203938.04918: variable 'omit' from source: magic vars 19285 1727203938.04928: starting attempt loop 19285 1727203938.04935: running the handler 19285 1727203938.05077: variable '__network_connections_result' from source: set_fact 19285 1727203938.05139: handler run complete 19285 1727203938.05162: attempt loop complete, returning result 19285 1727203938.05169: _execute() done 19285 1727203938.05180: dumping result to json 19285 1727203938.05187: done dumping result, returning 19285 1727203938.05199: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-f31b-fb3f-00000000006c] 19285 1727203938.05280: sending task result for task 028d2410-947f-f31b-fb3f-00000000006c ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 19285 1727203938.05497: no more pending results, returning what we have 19285 1727203938.05501: results queue empty 19285 1727203938.05502: checking for any_errors_fatal 19285 1727203938.05511: done checking for any_errors_fatal 19285 1727203938.05511: checking for max_fail_percentage 19285 1727203938.05513: done checking for max_fail_percentage 19285 1727203938.05514: checking to see if all hosts have failed and the running result is not ok 19285 1727203938.05515: done checking to see if all hosts have failed 19285 1727203938.05516: getting the remaining hosts for this loop 19285 1727203938.05518: done getting the remaining hosts for this loop 19285 1727203938.05522: getting the next task for host managed-node2 19285 1727203938.05529: done getting next task for host managed-node2 19285 
1727203938.05532: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203938.05534: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203938.05546: getting variables 19285 1727203938.05549: in VariableManager get_vars() 19285 1727203938.05590: Calling all_inventory to load vars for managed-node2 19285 1727203938.05594: Calling groups_inventory to load vars for managed-node2 19285 1727203938.05597: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203938.05606: Calling all_plugins_play to load vars for managed-node2 19285 1727203938.05610: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203938.05613: Calling groups_plugins_play to load vars for managed-node2 19285 1727203938.06189: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006c 19285 1727203938.06192: WORKER PROCESS EXITING 19285 1727203938.08216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203938.10217: done with get_vars() 19285 1727203938.10241: done getting variables 19285 1727203938.10311: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:52:18 -0400 (0:00:00.083) 0:00:37.177 ***** 
19285 1727203938.10340: entering _queue_task() for managed-node2/debug 19285 1727203938.10945: worker is 1 (out of 1 available) 19285 1727203938.10957: exiting _queue_task() for managed-node2/debug 19285 1727203938.10970: done queuing things up, now waiting for results queue to drain 19285 1727203938.10972: waiting for pending results... 19285 1727203938.11261: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19285 1727203938.11483: in run() - task 028d2410-947f-f31b-fb3f-00000000006d 19285 1727203938.11486: variable 'ansible_search_path' from source: unknown 19285 1727203938.11489: variable 'ansible_search_path' from source: unknown 19285 1727203938.11492: calling self._execute() 19285 1727203938.11739: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.11743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.11745: variable 'omit' from source: magic vars 19285 1727203938.12055: variable 'ansible_distribution_major_version' from source: facts 19285 1727203938.12102: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203938.12114: variable 'omit' from source: magic vars 19285 1727203938.12190: variable 'omit' from source: magic vars 19285 1727203938.12301: variable 'omit' from source: magic vars 19285 1727203938.12402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203938.12484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203938.12625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203938.12628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.12630: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.12633: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203938.12635: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.12637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.12745: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203938.12758: Set connection var ansible_pipelining to False 19285 1727203938.12772: Set connection var ansible_timeout to 10 19285 1727203938.12782: Set connection var ansible_shell_type to sh 19285 1727203938.12816: Set connection var ansible_shell_executable to /bin/sh 19285 1727203938.12850: Set connection var ansible_connection to ssh 19285 1727203938.12881: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.12889: variable 'ansible_connection' from source: unknown 19285 1727203938.12897: variable 'ansible_module_compression' from source: unknown 19285 1727203938.12952: variable 'ansible_shell_type' from source: unknown 19285 1727203938.12955: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.12957: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.12962: variable 'ansible_pipelining' from source: unknown 19285 1727203938.12964: variable 'ansible_timeout' from source: unknown 19285 1727203938.12966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.13210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203938.13214: variable 'omit' from source: magic vars 19285 1727203938.13216: starting attempt 
loop
19285 1727203938.13219: running the handler
19285 1727203938.13277: variable '__network_connections_result' from source: set_fact
19285 1727203938.13498: variable '__network_connections_result' from source: set_fact
19285 1727203938.13645: handler run complete
19285 1727203938.13697: attempt loop complete, returning result
19285 1727203938.13726: _execute() done
19285 1727203938.13743: dumping result to json
19285 1727203938.13819: done dumping result, returning
19285 1727203938.13823: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-f31b-fb3f-00000000006d]
19285 1727203938.13825: sending task result for task 028d2410-947f-f31b-fb3f-00000000006d
19285 1727203938.13909: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006d
19285 1727203938.13912: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "LSR-TST-br31",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
19285 1727203938.14005: no more pending results, returning what we have
19285 1727203938.14009: results queue empty
19285 1727203938.14012: checking for any_errors_fatal
19285 1727203938.14026: done checking for any_errors_fatal
19285 1727203938.14027: checking for max_fail_percentage
19285 1727203938.14029: done checking for max_fail_percentage
19285 1727203938.14030: checking to see if all hosts have failed and the running result is not ok
19285 1727203938.14031: done checking to see if all hosts have failed
19285 1727203938.14032: getting the remaining hosts for this loop
19285 1727203938.14034: done getting the remaining hosts for this loop
19285 1727203938.14038: getting the next task for host managed-node2
19285 1727203938.14046: done getting next task for host managed-node2
19285 1727203938.14050: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
19285 1727203938.14052: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203938.14068: getting variables
19285 1727203938.14070: in VariableManager get_vars()
19285 1727203938.14295: Calling all_inventory to load vars for managed-node2
19285 1727203938.14372: Calling groups_inventory to load vars for managed-node2
19285 1727203938.14403: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203938.14443: Calling all_plugins_play to load vars for managed-node2
19285 1727203938.14450: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203938.14455: Calling groups_plugins_play to load vars for managed-node2
19285 1727203938.17035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203938.19824: done with get_vars()
19285 1727203938.19857: done getting variables
19285 1727203938.19960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:52:18 -0400 (0:00:00.096) 0:00:37.274 *****
19285 1727203938.19997: entering _queue_task() for managed-node2/debug
19285 1727203938.20465: worker is 1 (out of 1 available)
19285 1727203938.20481: exiting _queue_task() for managed-node2/debug
19285 1727203938.20506: done queuing things up, now waiting for results queue to drain
19285 1727203938.20508: waiting for pending results...
19285 1727203938.21038: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
19285 1727203938.21122: in run() - task 028d2410-947f-f31b-fb3f-00000000006e
19285 1727203938.21127: variable 'ansible_search_path' from source: unknown
19285 1727203938.21130: variable 'ansible_search_path' from source: unknown
19285 1727203938.21143: calling self._execute()
19285 1727203938.21319: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203938.21324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203938.21328: variable 'omit' from source: magic vars
19285 1727203938.21798: variable 'ansible_distribution_major_version' from source: facts
19285 1727203938.21817: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203938.22138: variable 'network_state' from source: role '' defaults
19285 1727203938.22247: Evaluated conditional (network_state != {}): False
19285 1727203938.22300: when evaluation is False, skipping this task
19285 1727203938.22303: _execute() done
19285 1727203938.22306: dumping result to json
19285 1727203938.22422: done dumping result, returning
19285 1727203938.22428: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-f31b-fb3f-00000000006e]
19285 1727203938.22430: sending task result for task 028d2410-947f-f31b-fb3f-00000000006e
19285 1727203938.22498: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006e
19285 1727203938.22501: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "false_condition": "network_state != {}"
}
19285 1727203938.22552: no more pending results, returning what we have
19285 1727203938.22557: results queue empty
19285 1727203938.22558: checking for any_errors_fatal
19285 1727203938.22571: done checking for any_errors_fatal
19285 1727203938.22572: checking for max_fail_percentage
19285 1727203938.22574: done checking for max_fail_percentage
19285 1727203938.22577: checking to see if all hosts have failed and the running result is not ok
19285 1727203938.22578: done checking to see if all hosts have failed
19285 1727203938.22579: getting the remaining hosts for this loop
19285 1727203938.22581: done getting the remaining hosts for this loop
19285 1727203938.22585: getting the next task for host managed-node2
19285 1727203938.22593: done getting next task for host managed-node2
19285 1727203938.22598: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
19285 1727203938.22601: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203938.22618: getting variables
19285 1727203938.22620: in VariableManager get_vars()
19285 1727203938.22663: Calling all_inventory to load vars for managed-node2
19285 1727203938.22666: Calling groups_inventory to load vars for managed-node2
19285 1727203938.22669: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203938.22790: Calling all_plugins_play to load vars for managed-node2
19285 1727203938.22795: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203938.22799: Calling groups_plugins_play to load vars for managed-node2
19285 1727203938.25677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203938.29287: done with get_vars()
19285 1727203938.29325: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:52:18 -0400 (0:00:00.094) 0:00:37.368 *****
19285 1727203938.29433: entering _queue_task() for managed-node2/ping
19285 1727203938.29996: worker is 1 (out of 1 available)
19285 1727203938.30007: exiting _queue_task() for managed-node2/ping
19285 1727203938.30018: done queuing things up, now waiting for results queue to drain
19285 1727203938.30020: waiting for pending results...
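The task queued above ("Re-test connectivity", managed-node2/ping) runs Ansible's builtin ping module on the target; the log below shows it returning `{"ping": "pong"}` on stdout. As a rough, hypothetical illustration of that behavior (not the actual `ansible.modules.ping` source, which wraps this logic in `AnsibleModule` argument handling), a minimal sketch:

```python
import json

def ping(data="pong"):
    # The builtin module simply echoes back its "data" argument;
    # "pong" is the default. Passing data == "crash" deliberately
    # raises, as a hook for testing failure handling.
    if data == "crash":
        raise Exception("boom")
    return {"ping": data}

# On the target, the AnsiballZ wrapper prints a single JSON document
# on stdout, which the controller captures as the module result:
print(json.dumps({"ping": ping()["ping"],
                  "invocation": {"module_args": {"data": "pong"}}}))
```

This is why a healthy run of the task reports `ok: [host] => {"changed": false, "ping": "pong"}`.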
19285 1727203938.30729: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
19285 1727203938.30736: in run() - task 028d2410-947f-f31b-fb3f-00000000006f
19285 1727203938.30739: variable 'ansible_search_path' from source: unknown
19285 1727203938.30743: variable 'ansible_search_path' from source: unknown
19285 1727203938.30881: calling self._execute()
19285 1727203938.30981: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203938.30988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203938.31027: variable 'omit' from source: magic vars
19285 1727203938.31591: variable 'ansible_distribution_major_version' from source: facts
19285 1727203938.31601: Evaluated conditional (ansible_distribution_major_version != '6'): True
19285 1727203938.31608: variable 'omit' from source: magic vars
19285 1727203938.31643: variable 'omit' from source: magic vars
19285 1727203938.31670: variable 'omit' from source: magic vars
19285 1727203938.31743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19285 1727203938.31839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19285 1727203938.31844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19285 1727203938.31878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203938.31895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19285 1727203938.31958: variable 'inventory_hostname' from source: host vars for 'managed-node2'
19285 1727203938.31961: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203938.31965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203938.32141: Set connection var ansible_module_compression to ZIP_DEFLATED
19285 1727203938.32145: Set connection var ansible_pipelining to False
19285 1727203938.32175: Set connection var ansible_timeout to 10
19285 1727203938.32179: Set connection var ansible_shell_type to sh
19285 1727203938.32183: Set connection var ansible_shell_executable to /bin/sh
19285 1727203938.32186: Set connection var ansible_connection to ssh
19285 1727203938.32225: variable 'ansible_shell_executable' from source: unknown
19285 1727203938.32228: variable 'ansible_connection' from source: unknown
19285 1727203938.32233: variable 'ansible_module_compression' from source: unknown
19285 1727203938.32235: variable 'ansible_shell_type' from source: unknown
19285 1727203938.32237: variable 'ansible_shell_executable' from source: unknown
19285 1727203938.32239: variable 'ansible_host' from source: host vars for 'managed-node2'
19285 1727203938.32241: variable 'ansible_pipelining' from source: unknown
19285 1727203938.32246: variable 'ansible_timeout' from source: unknown
19285 1727203938.32248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
19285 1727203938.32491: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
19285 1727203938.32500: variable 'omit' from source: magic vars
19285 1727203938.32513: starting attempt loop
19285 1727203938.32516: running the handler
19285 1727203938.32556: _low_level_execute_command(): starting
19285 1727203938.32560: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
19285 1727203938.33181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203938.33184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203938.33204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<<
19285 1727203938.33208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203938.33329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.33482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.35112: stdout chunk (state=3): >>>/root <<<
19285 1727203938.35283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203938.35288: stdout chunk (state=3): >>><<<
19285 1727203938.35291: stderr chunk (state=3): >>><<<
19285 1727203938.35395: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203938.35399: _low_level_execute_command(): starting
19285 1727203938.35402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535 `" && echo ansible-tmp-1727203938.3535633-22287-16572650036535="` echo /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535 `" ) && sleep 0'
19285 1727203938.36092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203938.36109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<<
19285 1727203938.36127: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203938.36202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203938.36222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203938.36239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.36352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.38261: stdout chunk (state=3): >>>ansible-tmp-1727203938.3535633-22287-16572650036535=/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535 <<<
19285 1727203938.38427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203938.38430: stdout chunk (state=3): >>><<<
19285 1727203938.38432: stderr chunk (state=3): >>><<<
19285 1727203938.38581: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203938.3535633-22287-16572650036535=/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203938.38584: variable 'ansible_module_compression' from source: unknown
19285 1727203938.38586: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
19285 1727203938.38598: variable 'ansible_facts' from source: unknown
19285 1727203938.38717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py
19285 1727203938.38942: Sending initial data
19285 1727203938.38982: Sent initial data (152 bytes)
19285 1727203938.39915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19285 1727203938.39993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203938.40040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203938.40138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.40385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.41912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
19285 1727203938.41994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
19285 1727203938.42101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmptikp0cpd /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py <<<
19285 1727203938.42105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py" <<<
19285 1727203938.42168: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmptikp0cpd" to remote "/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py" <<<
19285 1727203938.43518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203938.43567: stderr chunk (state=3): >>><<<
19285 1727203938.43594: stdout chunk (state=3): >>><<<
19285 1727203938.43613: done transferring module to remote
19285 1727203938.43704: _low_level_execute_command(): starting
19285 1727203938.43707: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/ /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py && sleep 0'
19285 1727203938.44483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203938.44600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203938.44648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.44790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.46608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203938.46617: stdout chunk (state=3): >>><<<
19285 1727203938.46627: stderr chunk (state=3): >>><<<
19285 1727203938.46647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203938.46729: _low_level_execute_command(): starting
19285 1727203938.46732: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/AnsiballZ_ping.py && sleep 0'
19285 1727203938.47256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203938.47273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19285 1727203938.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19285 1727203938.47312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19285 1727203938.47331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<<
19285 1727203938.47343: stderr chunk (state=3): >>>debug2: match not found <<<
19285 1727203938.47356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203938.47378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19285 1727203938.47465: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203938.47486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203938.47503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.47613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.62705: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<<
19285 1727203938.64120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<<
19285 1727203938.64137: stdout chunk (state=3): >>><<<
19285 1727203938.64152: stderr chunk (state=3): >>><<<
19285 1727203938.64177: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed.
19285 1727203938.64216: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
19285 1727203938.64242: _low_level_execute_command(): starting
19285 1727203938.64258: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203938.3535633-22287-16572650036535/ > /dev/null 2>&1 && sleep 0'
19285 1727203938.65617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19285 1727203938.65697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19285 1727203938.65823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<<
19285 1727203938.65937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19285 1727203938.66046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19285 1727203938.66123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19285 1727203938.68181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19285 1727203938.68185: stdout chunk (state=3): >>><<<
19285 1727203938.68187: stderr chunk (state=3): >>><<<
19285 1727203938.68189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19285 1727203938.68197: handler run complete
19285 1727203938.68200: attempt loop complete, returning result
19285 1727203938.68202: _execute() done
19285 1727203938.68203: dumping result to json
19285 1727203938.68206: done dumping result, returning
19285 1727203938.68208: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-f31b-fb3f-00000000006f]
19285 1727203938.68210: sending task result for task 028d2410-947f-f31b-fb3f-00000000006f
19285 1727203938.68271: done sending task result for task 028d2410-947f-f31b-fb3f-00000000006f
19285 1727203938.68274: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "ping": "pong"
}
19285 1727203938.68346: no more pending results, returning what we have
19285 1727203938.68350: results queue empty
19285 1727203938.68351: checking for any_errors_fatal
19285 1727203938.68363: done checking for any_errors_fatal
19285 1727203938.68364: checking for max_fail_percentage
19285 1727203938.68367: done checking for max_fail_percentage
19285 1727203938.68368: checking to see if all hosts have failed and the running result is not ok
19285 1727203938.68369: done checking to see if all hosts have failed
19285 1727203938.68370: getting the remaining hosts for this loop
19285 1727203938.68372: done getting the remaining hosts for this loop
19285 1727203938.68379: getting the next task for host managed-node2
19285 1727203938.68389: done getting next task for host managed-node2
19285 1727203938.68392: ^ task is: TASK: meta (role_complete)
19285 1727203938.68394: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203938.68406: getting variables
19285 1727203938.68408: in VariableManager get_vars()
19285 1727203938.68451: Calling all_inventory to load vars for managed-node2
19285 1727203938.68455: Calling groups_inventory to load vars for managed-node2
19285 1727203938.68458: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203938.68472: Calling all_plugins_play to load vars for managed-node2
19285 1727203938.68476: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203938.68596: Calling groups_plugins_play to load vars for managed-node2
19285 1727203938.70935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203938.73672: done with get_vars()
19285 1727203938.73702: done getting variables
19285 1727203938.73809: done queuing things up, now waiting for results queue to drain
19285 1727203938.73811: results queue empty
19285 1727203938.73812: checking for any_errors_fatal
19285 1727203938.73815: done checking for any_errors_fatal
19285 1727203938.73815: checking for max_fail_percentage
19285 1727203938.73817: done checking for max_fail_percentage
19285 1727203938.73817: checking to see if all hosts have failed and the running result is not ok
19285 1727203938.73818: done checking to see if all hosts have failed
19285 1727203938.73819: getting the remaining hosts for this loop
19285 1727203938.73820: done getting the remaining hosts for this loop
19285 1727203938.73822: getting the next task for host managed-node2
19285 1727203938.73826: done getting next task for host managed-node2
19285 1727203938.73827: ^ task is: TASK: meta (flush_handlers)
19285 1727203938.73829: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203938.73832: getting variables
19285 1727203938.73832: in VariableManager get_vars()
19285 1727203938.73845: Calling all_inventory to load vars for managed-node2
19285 1727203938.73847: Calling groups_inventory to load vars for managed-node2
19285 1727203938.73849: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203938.73854: Calling all_plugins_play to load vars for managed-node2
19285 1727203938.73857: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203938.73868: Calling groups_plugins_play to load vars for managed-node2
19285 1727203938.75782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203938.76668: done with get_vars()
19285 1727203938.76686: done getting variables
19285 1727203938.76721: in VariableManager get_vars()
19285 1727203938.76730: Calling all_inventory to load vars for managed-node2
19285 1727203938.76731: Calling groups_inventory to load vars for managed-node2
19285 1727203938.76733: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203938.76736: Calling all_plugins_play to load vars for managed-node2
19285 1727203938.76737: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203938.76739: Calling groups_plugins_play to load vars for managed-node2
19285 1727203938.77397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203938.80133: done with get_vars()
19285 1727203938.80160: done queuing things up, now waiting for results queue to drain
19285 1727203938.80163: results queue empty
19285 1727203938.80164: checking for any_errors_fatal
19285 1727203938.80165: done checking for any_errors_fatal
19285 1727203938.80166: checking for max_fail_percentage
19285 1727203938.80167: done checking for max_fail_percentage
19285 1727203938.80168: checking to see if all hosts have failed and
the running result is not ok 19285 1727203938.80169: done checking to see if all hosts have failed 19285 1727203938.80169: getting the remaining hosts for this loop 19285 1727203938.80170: done getting the remaining hosts for this loop 19285 1727203938.80173: getting the next task for host managed-node2 19285 1727203938.80179: done getting next task for host managed-node2 19285 1727203938.80180: ^ task is: TASK: meta (flush_handlers) 19285 1727203938.80182: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203938.80185: getting variables 19285 1727203938.80186: in VariableManager get_vars() 19285 1727203938.80197: Calling all_inventory to load vars for managed-node2 19285 1727203938.80199: Calling groups_inventory to load vars for managed-node2 19285 1727203938.80201: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203938.80208: Calling all_plugins_play to load vars for managed-node2 19285 1727203938.80211: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203938.80214: Calling groups_plugins_play to load vars for managed-node2 19285 1727203938.82593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203938.83646: done with get_vars() 19285 1727203938.83664: done getting variables 19285 1727203938.83700: in VariableManager get_vars() 19285 1727203938.83709: Calling all_inventory to load vars for managed-node2 19285 1727203938.83711: Calling groups_inventory to load vars for managed-node2 19285 1727203938.83712: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203938.83715: Calling all_plugins_play to load vars for managed-node2 19285 1727203938.83717: Calling 
groups_plugins_inventory to load vars for managed-node2 19285 1727203938.83720: Calling groups_plugins_play to load vars for managed-node2 19285 1727203938.84708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203938.86818: done with get_vars() 19285 1727203938.86851: done queuing things up, now waiting for results queue to drain 19285 1727203938.86853: results queue empty 19285 1727203938.86853: checking for any_errors_fatal 19285 1727203938.86855: done checking for any_errors_fatal 19285 1727203938.86856: checking for max_fail_percentage 19285 1727203938.86856: done checking for max_fail_percentage 19285 1727203938.86857: checking to see if all hosts have failed and the running result is not ok 19285 1727203938.86858: done checking to see if all hosts have failed 19285 1727203938.86858: getting the remaining hosts for this loop 19285 1727203938.86859: done getting the remaining hosts for this loop 19285 1727203938.86862: getting the next task for host managed-node2 19285 1727203938.86866: done getting next task for host managed-node2 19285 1727203938.86867: ^ task is: None 19285 1727203938.86868: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203938.86869: done queuing things up, now waiting for results queue to drain 19285 1727203938.86870: results queue empty 19285 1727203938.86871: checking for any_errors_fatal 19285 1727203938.86872: done checking for any_errors_fatal 19285 1727203938.86872: checking for max_fail_percentage 19285 1727203938.86873: done checking for max_fail_percentage 19285 1727203938.86874: checking to see if all hosts have failed and the running result is not ok 19285 1727203938.86875: done checking to see if all hosts have failed 19285 1727203938.86878: getting the next task for host managed-node2 19285 1727203938.86880: done getting next task for host managed-node2 19285 1727203938.86881: ^ task is: None 19285 1727203938.86882: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203938.86920: in VariableManager get_vars() 19285 1727203938.86949: done with get_vars() 19285 1727203938.86956: in VariableManager get_vars() 19285 1727203938.86965: done with get_vars() 19285 1727203938.86969: variable 'omit' from source: magic vars 19285 1727203938.87104: variable 'task' from source: play vars 19285 1727203938.87138: in VariableManager get_vars() 19285 1727203938.87153: done with get_vars() 19285 1727203938.87306: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 19285 1727203938.87683: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203938.87704: getting the remaining hosts for this loop 19285 1727203938.87705: done getting the remaining hosts for this loop 19285 1727203938.87707: getting the next task for host managed-node2 19285 1727203938.87709: done getting next task for host managed-node2 19285 1727203938.87711: ^ task is: TASK: Gathering Facts 19285 1727203938.87712: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203938.87713: getting variables 19285 1727203938.87713: in VariableManager get_vars() 19285 1727203938.87719: Calling all_inventory to load vars for managed-node2 19285 1727203938.87721: Calling groups_inventory to load vars for managed-node2 19285 1727203938.87722: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203938.87726: Calling all_plugins_play to load vars for managed-node2 19285 1727203938.87727: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203938.87729: Calling groups_plugins_play to load vars for managed-node2 19285 1727203938.88510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203938.90101: done with get_vars() 19285 1727203938.90144: done getting variables 19285 1727203938.90215: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:52:18 -0400 (0:00:00.608) 0:00:37.976 ***** 19285 1727203938.90240: entering _queue_task() for managed-node2/gather_facts 19285 1727203938.91050: worker is 1 (out of 1 available) 19285 1727203938.91064: exiting _queue_task() for managed-node2/gather_facts 19285 1727203938.91080: done queuing things up, now waiting for results queue to drain 19285 1727203938.91081: waiting for pending results... 
19285 1727203938.92087: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203938.92801: in run() - task 028d2410-947f-f31b-fb3f-00000000046e 19285 1727203938.92806: variable 'ansible_search_path' from source: unknown 19285 1727203938.93080: calling self._execute() 19285 1727203938.93300: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.93605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.93636: variable 'omit' from source: magic vars 19285 1727203938.95519: variable 'ansible_distribution_major_version' from source: facts 19285 1727203938.95524: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203938.95528: variable 'omit' from source: magic vars 19285 1727203938.95563: variable 'omit' from source: magic vars 19285 1727203938.95619: variable 'omit' from source: magic vars 19285 1727203938.95727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203938.95958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203938.95961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203938.95965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.95967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203938.96178: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203938.96183: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.96186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.96339: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203938.96352: Set connection var ansible_pipelining to False 19285 1727203938.96405: Set connection var ansible_timeout to 10 19285 1727203938.96414: Set connection var ansible_shell_type to sh 19285 1727203938.96427: Set connection var ansible_shell_executable to /bin/sh 19285 1727203938.96436: Set connection var ansible_connection to ssh 19285 1727203938.96509: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.96517: variable 'ansible_connection' from source: unknown 19285 1727203938.96525: variable 'ansible_module_compression' from source: unknown 19285 1727203938.96532: variable 'ansible_shell_type' from source: unknown 19285 1727203938.96826: variable 'ansible_shell_executable' from source: unknown 19285 1727203938.96829: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203938.96832: variable 'ansible_pipelining' from source: unknown 19285 1727203938.96834: variable 'ansible_timeout' from source: unknown 19285 1727203938.96836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203938.97283: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203938.97286: variable 'omit' from source: magic vars 19285 1727203938.97290: starting attempt loop 19285 1727203938.97293: running the handler 19285 1727203938.97296: variable 'ansible_facts' from source: unknown 19285 1727203938.97298: _low_level_execute_command(): starting 19285 1727203938.97299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203938.98533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203938.98549: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19285 1727203938.98563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203938.98585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203938.98605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203938.98617: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203938.98631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203938.98649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203938.98663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203938.98674: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203938.98710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203938.98773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203938.98794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203938.98820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203938.99033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.00725: stdout chunk (state=3): >>>/root <<< 19285 1727203939.00899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203939.00927: stderr chunk (state=3): >>><<< 19285 
1727203939.00991: stdout chunk (state=3): >>><<< 19285 1727203939.01030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203939.01059: _low_level_execute_command(): starting 19285 1727203939.01119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637 `" && echo ansible-tmp-1727203939.0104-22320-174062280325637="` echo /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637 `" ) && sleep 0' 19285 1727203939.01851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203939.01890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
19285 1727203939.01991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203939.02020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203939.02043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203939.02138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203939.02242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.04173: stdout chunk (state=3): >>>ansible-tmp-1727203939.0104-22320-174062280325637=/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637 <<< 19285 1727203939.04372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203939.04377: stdout chunk (state=3): >>><<< 19285 1727203939.04380: stderr chunk (state=3): >>><<< 19285 1727203939.04441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203939.0104-22320-174062280325637=/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203939.04445: variable 'ansible_module_compression' from source: unknown 19285 1727203939.04494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203939.04565: variable 'ansible_facts' from source: unknown 19285 1727203939.04789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py 19285 1727203939.04929: Sending initial data 19285 1727203939.04990: Sent initial data (151 bytes) 19285 1727203939.05947: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203939.05964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203939.06005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203939.06162: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203939.06195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203939.06352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.08062: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203939.08132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203939.08211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpw6selmo8 /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py <<< 19285 1727203939.08214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py" <<< 19285 1727203939.08313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpw6selmo8" to remote "/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py" <<< 19285 1727203939.10035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203939.10191: stderr chunk (state=3): >>><<< 19285 1727203939.10195: stdout chunk (state=3): >>><<< 19285 1727203939.10197: done transferring module to remote 19285 1727203939.10204: _low_level_execute_command(): starting 19285 1727203939.10207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/ /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py && sleep 0' 19285 1727203939.11339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203939.11429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203939.11441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203939.11461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203939.11538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203939.11951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203939.11955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203939.12015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203939.12110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.14216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203939.14220: stdout chunk (state=3): >>><<< 19285 1727203939.14222: stderr chunk (state=3): >>><<< 19285 1727203939.14244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203939.14270: _low_level_execute_command(): starting 19285 1727203939.14304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/AnsiballZ_setup.py && sleep 0' 19285 1727203939.15807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203939.15851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203939.15888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203939.15973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 
1727203939.16089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203939.16290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.78284: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": 
"", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.4326171875, "15m": 0.2158203125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": 
"127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 605, "free": 2926}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 525, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787873280, "block_size": 4096, "block_total": 65519099, "block_available": 63913055, "block_used": 1606044, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "19", "epoch": "1727203939", "epoch_int": "1727203939", "date": "2024-09-24", "time": "14:52:19", "iso8601_micro": "2024-09-24T18:52:19.778170Z", "iso8601": 
"2024-09-24T18:52:19Z", "iso8601_basic": "20240924T145219778170", "iso8601_basic_short": "20240924T145219", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203939.80153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203939.80158: stdout chunk (state=3): >>><<< 19285 1727203939.80163: stderr chunk (state=3): >>><<< 19285 1727203939.80221: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.4326171875, "15m": 0.2158203125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2926, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 605, "free": 2926}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 525, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787873280, "block_size": 4096, "block_total": 65519099, "block_available": 63913055, "block_used": 1606044, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "19", "epoch": "1727203939", "epoch_int": "1727203939", "date": "2024-09-24", "time": "14:52:19", "iso8601_micro": "2024-09-24T18:52:19.778170Z", "iso8601": "2024-09-24T18:52:19Z", "iso8601_basic": "20240924T145219778170", "iso8601_basic_short": "20240924T145219", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203939.81244: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203939.81252: _low_level_execute_command(): starting 19285 1727203939.81263: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203939.0104-22320-174062280325637/ > /dev/null 2>&1 && sleep 0' 19285 1727203939.81857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203939.81877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203939.81894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203939.81912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203939.81930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203939.81943: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203939.81957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203939.81985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 19285 1727203939.81999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203939.82010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203939.82090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203939.82192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203939.82287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203939.84157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203939.84315: stderr chunk (state=3): >>><<< 19285 1727203939.84325: stdout chunk (state=3): >>><<< 19285 1727203939.84347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203939.84360: handler run complete 19285 1727203939.84538: variable 'ansible_facts' from source: unknown 19285 1727203939.84769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203939.85050: variable 'ansible_facts' from source: unknown 19285 1727203939.85209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203939.85369: attempt loop complete, returning result 19285 1727203939.85381: _execute() done 19285 1727203939.85388: dumping result to json 19285 1727203939.85421: done dumping result, returning 19285 1727203939.85438: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-00000000046e] 19285 1727203939.85463: sending task result for task 028d2410-947f-f31b-fb3f-00000000046e 19285 1727203939.87136: done sending task result for task 028d2410-947f-f31b-fb3f-00000000046e 19285 1727203939.87140: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203939.87874: no more pending results, returning what we have 19285 1727203939.87879: results queue empty 19285 1727203939.87880: checking for any_errors_fatal 19285 1727203939.87882: done checking for any_errors_fatal 19285 1727203939.87882: checking for max_fail_percentage 19285 1727203939.87884: done checking for max_fail_percentage 19285 1727203939.87885: checking to see if all hosts have failed and the running result is not ok 19285 1727203939.87886: done checking to see if all hosts have failed 19285 1727203939.87886: getting 
the remaining hosts for this loop 19285 1727203939.87888: done getting the remaining hosts for this loop 19285 1727203939.87891: getting the next task for host managed-node2 19285 1727203939.87896: done getting next task for host managed-node2 19285 1727203939.87897: ^ task is: TASK: meta (flush_handlers) 19285 1727203939.87899: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203939.87902: getting variables 19285 1727203939.87903: in VariableManager get_vars() 19285 1727203939.88042: Calling all_inventory to load vars for managed-node2 19285 1727203939.88045: Calling groups_inventory to load vars for managed-node2 19285 1727203939.88049: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203939.88061: Calling all_plugins_play to load vars for managed-node2 19285 1727203939.88065: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203939.88068: Calling groups_plugins_play to load vars for managed-node2 19285 1727203939.91107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203939.94391: done with get_vars() 19285 1727203939.94420: done getting variables 19285 1727203939.94510: in VariableManager get_vars() 19285 1727203939.94522: Calling all_inventory to load vars for managed-node2 19285 1727203939.94524: Calling groups_inventory to load vars for managed-node2 19285 1727203939.94527: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203939.94532: Calling all_plugins_play to load vars for managed-node2 19285 1727203939.94534: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203939.94537: Calling groups_plugins_play to load vars for managed-node2 19285 
1727203939.96121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203939.99348: done with get_vars() 19285 1727203939.99606: done queuing things up, now waiting for results queue to drain 19285 1727203939.99613: results queue empty 19285 1727203939.99614: checking for any_errors_fatal 19285 1727203939.99618: done checking for any_errors_fatal 19285 1727203939.99618: checking for max_fail_percentage 19285 1727203939.99619: done checking for max_fail_percentage 19285 1727203939.99620: checking to see if all hosts have failed and the running result is not ok 19285 1727203939.99621: done checking to see if all hosts have failed 19285 1727203939.99621: getting the remaining hosts for this loop 19285 1727203939.99622: done getting the remaining hosts for this loop 19285 1727203939.99625: getting the next task for host managed-node2 19285 1727203939.99628: done getting next task for host managed-node2 19285 1727203939.99631: ^ task is: TASK: Include the task '{{ task }}' 19285 1727203939.99632: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203939.99634: getting variables 19285 1727203939.99635: in VariableManager get_vars() 19285 1727203939.99644: Calling all_inventory to load vars for managed-node2 19285 1727203939.99647: Calling groups_inventory to load vars for managed-node2 19285 1727203939.99649: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203939.99654: Calling all_plugins_play to load vars for managed-node2 19285 1727203939.99657: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203939.99662: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.04134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.07839: done with get_vars() 19285 1727203940.07869: done getting variables 19285 1727203940.08483: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:52:20 -0400 (0:00:01.182) 0:00:39.159 ***** 19285 1727203940.08513: entering _queue_task() for managed-node2/include_tasks 19285 1727203940.09657: worker is 1 (out of 1 available) 19285 1727203940.09675: exiting _queue_task() for managed-node2/include_tasks 19285 1727203940.09690: done queuing things up, now waiting for results queue to drain 19285 1727203940.09691: waiting for pending results... 
19285 1727203940.10395: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' 19285 1727203940.10426: in run() - task 028d2410-947f-f31b-fb3f-000000000073 19285 1727203940.10448: variable 'ansible_search_path' from source: unknown 19285 1727203940.10582: calling self._execute() 19285 1727203940.10792: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.10973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.10979: variable 'omit' from source: magic vars 19285 1727203940.11648: variable 'ansible_distribution_major_version' from source: facts 19285 1727203940.11669: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203940.11685: variable 'task' from source: play vars 19285 1727203940.12093: variable 'task' from source: play vars 19285 1727203940.12105: _execute() done 19285 1727203940.12114: dumping result to json 19285 1727203940.12121: done dumping result, returning 19285 1727203940.12130: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_profile_absent.yml' [028d2410-947f-f31b-fb3f-000000000073] 19285 1727203940.12138: sending task result for task 028d2410-947f-f31b-fb3f-000000000073 19285 1727203940.12442: no more pending results, returning what we have 19285 1727203940.12448: in VariableManager get_vars() 19285 1727203940.12490: Calling all_inventory to load vars for managed-node2 19285 1727203940.12493: Calling groups_inventory to load vars for managed-node2 19285 1727203940.12498: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.12512: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.12515: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.12520: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.13310: done sending task result for task 
028d2410-947f-f31b-fb3f-000000000073 19285 1727203940.13314: WORKER PROCESS EXITING 19285 1727203940.16063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.19458: done with get_vars() 19285 1727203940.19499: variable 'ansible_search_path' from source: unknown 19285 1727203940.19516: we have included files to process 19285 1727203940.19517: generating all_blocks data 19285 1727203940.19518: done generating all_blocks data 19285 1727203940.19519: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19285 1727203940.19521: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19285 1727203940.19523: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19285 1727203940.19746: in VariableManager get_vars() 19285 1727203940.19773: done with get_vars() 19285 1727203940.19891: done processing included file 19285 1727203940.19893: iterating over new_blocks loaded from include file 19285 1727203940.19895: in VariableManager get_vars() 19285 1727203940.19905: done with get_vars() 19285 1727203940.19907: filtering new block on tags 19285 1727203940.19930: done filtering new block on tags 19285 1727203940.19933: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 19285 1727203940.19938: extending task lists for all hosts with included blocks 19285 1727203940.19969: done extending task lists 19285 1727203940.19970: done processing included files 19285 1727203940.19970: results queue empty 19285 1727203940.19971: checking for any_errors_fatal 19285 1727203940.19973: done checking 
for any_errors_fatal 19285 1727203940.19973: checking for max_fail_percentage 19285 1727203940.19974: done checking for max_fail_percentage 19285 1727203940.19977: checking to see if all hosts have failed and the running result is not ok 19285 1727203940.19977: done checking to see if all hosts have failed 19285 1727203940.19978: getting the remaining hosts for this loop 19285 1727203940.19979: done getting the remaining hosts for this loop 19285 1727203940.19982: getting the next task for host managed-node2 19285 1727203940.19985: done getting next task for host managed-node2 19285 1727203940.19988: ^ task is: TASK: Include the task 'get_profile_stat.yml' 19285 1727203940.19997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203940.19999: getting variables 19285 1727203940.20000: in VariableManager get_vars() 19285 1727203940.20009: Calling all_inventory to load vars for managed-node2 19285 1727203940.20011: Calling groups_inventory to load vars for managed-node2 19285 1727203940.20014: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.20027: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.20037: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.20041: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.22063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.25431: done with get_vars() 19285 1727203940.25462: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:52:20 -0400 (0:00:00.170) 0:00:39.330 ***** 19285 1727203940.25612: entering _queue_task() for managed-node2/include_tasks 19285 1727203940.26395: worker is 1 (out of 1 available) 19285 1727203940.26404: exiting _queue_task() for managed-node2/include_tasks 19285 1727203940.26417: done queuing things up, now waiting for results queue to drain 19285 1727203940.26419: waiting for pending results... 
19285 1727203940.26993: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 19285 1727203940.26998: in run() - task 028d2410-947f-f31b-fb3f-00000000047f 19285 1727203940.27001: variable 'ansible_search_path' from source: unknown 19285 1727203940.27004: variable 'ansible_search_path' from source: unknown 19285 1727203940.27382: calling self._execute() 19285 1727203940.27386: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.27389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.27393: variable 'omit' from source: magic vars 19285 1727203940.27989: variable 'ansible_distribution_major_version' from source: facts 19285 1727203940.28011: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203940.28023: _execute() done 19285 1727203940.28036: dumping result to json 19285 1727203940.28043: done dumping result, returning 19285 1727203940.28053: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-f31b-fb3f-00000000047f] 19285 1727203940.28063: sending task result for task 028d2410-947f-f31b-fb3f-00000000047f 19285 1727203940.28163: done sending task result for task 028d2410-947f-f31b-fb3f-00000000047f 19285 1727203940.28170: WORKER PROCESS EXITING 19285 1727203940.28204: no more pending results, returning what we have 19285 1727203940.28210: in VariableManager get_vars() 19285 1727203940.28244: Calling all_inventory to load vars for managed-node2 19285 1727203940.28247: Calling groups_inventory to load vars for managed-node2 19285 1727203940.28250: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.28268: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.28270: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.28273: Calling groups_plugins_play to load vars for managed-node2 19285 
1727203940.31439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.33092: done with get_vars() 19285 1727203940.33123: variable 'ansible_search_path' from source: unknown 19285 1727203940.33124: variable 'ansible_search_path' from source: unknown 19285 1727203940.33134: variable 'task' from source: play vars 19285 1727203940.33252: variable 'task' from source: play vars 19285 1727203940.33292: we have included files to process 19285 1727203940.33293: generating all_blocks data 19285 1727203940.33295: done generating all_blocks data 19285 1727203940.33296: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203940.33297: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203940.33299: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19285 1727203940.34290: done processing included file 19285 1727203940.34292: iterating over new_blocks loaded from include file 19285 1727203940.34293: in VariableManager get_vars() 19285 1727203940.34315: done with get_vars() 19285 1727203940.34317: filtering new block on tags 19285 1727203940.34341: done filtering new block on tags 19285 1727203940.34344: in VariableManager get_vars() 19285 1727203940.34355: done with get_vars() 19285 1727203940.34357: filtering new block on tags 19285 1727203940.34381: done filtering new block on tags 19285 1727203940.34384: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 19285 1727203940.34389: extending task lists for all hosts with included blocks 19285 1727203940.34499: done extending 
task lists 19285 1727203940.34501: done processing included files 19285 1727203940.34502: results queue empty 19285 1727203940.34502: checking for any_errors_fatal 19285 1727203940.34506: done checking for any_errors_fatal 19285 1727203940.34507: checking for max_fail_percentage 19285 1727203940.34508: done checking for max_fail_percentage 19285 1727203940.34509: checking to see if all hosts have failed and the running result is not ok 19285 1727203940.34510: done checking to see if all hosts have failed 19285 1727203940.34510: getting the remaining hosts for this loop 19285 1727203940.34512: done getting the remaining hosts for this loop 19285 1727203940.34514: getting the next task for host managed-node2 19285 1727203940.34518: done getting next task for host managed-node2 19285 1727203940.34527: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 19285 1727203940.34530: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203940.34532: getting variables 19285 1727203940.34533: in VariableManager get_vars() 19285 1727203940.34542: Calling all_inventory to load vars for managed-node2 19285 1727203940.34544: Calling groups_inventory to load vars for managed-node2 19285 1727203940.34546: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.34552: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.34554: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.34557: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.36331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.49534: done with get_vars() 19285 1727203940.49564: done getting variables 19285 1727203940.49606: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:52:20 -0400 (0:00:00.240) 0:00:39.570 ***** 19285 1727203940.49630: entering _queue_task() for managed-node2/set_fact 19285 1727203940.50486: worker is 1 (out of 1 available) 19285 1727203940.50500: exiting _queue_task() for managed-node2/set_fact 19285 1727203940.50681: done queuing things up, now waiting for results queue to drain 19285 1727203940.50684: waiting for pending results... 
19285 1727203940.50911: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 19285 1727203940.51082: in run() - task 028d2410-947f-f31b-fb3f-00000000048a 19285 1727203940.51087: variable 'ansible_search_path' from source: unknown 19285 1727203940.51089: variable 'ansible_search_path' from source: unknown 19285 1727203940.51092: calling self._execute() 19285 1727203940.51281: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.51286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.51289: variable 'omit' from source: magic vars 19285 1727203940.51607: variable 'ansible_distribution_major_version' from source: facts 19285 1727203940.51618: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203940.51624: variable 'omit' from source: magic vars 19285 1727203940.51671: variable 'omit' from source: magic vars 19285 1727203940.51715: variable 'omit' from source: magic vars 19285 1727203940.51821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203940.51824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203940.51828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203940.51900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203940.51903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203940.51906: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203940.51909: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.51911: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 19285 1727203940.52000: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203940.52005: Set connection var ansible_pipelining to False 19285 1727203940.52122: Set connection var ansible_timeout to 10 19285 1727203940.52125: Set connection var ansible_shell_type to sh 19285 1727203940.52128: Set connection var ansible_shell_executable to /bin/sh 19285 1727203940.52130: Set connection var ansible_connection to ssh 19285 1727203940.52132: variable 'ansible_shell_executable' from source: unknown 19285 1727203940.52134: variable 'ansible_connection' from source: unknown 19285 1727203940.52137: variable 'ansible_module_compression' from source: unknown 19285 1727203940.52140: variable 'ansible_shell_type' from source: unknown 19285 1727203940.52143: variable 'ansible_shell_executable' from source: unknown 19285 1727203940.52145: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.52148: variable 'ansible_pipelining' from source: unknown 19285 1727203940.52151: variable 'ansible_timeout' from source: unknown 19285 1727203940.52154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.52231: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203940.52239: variable 'omit' from source: magic vars 19285 1727203940.52245: starting attempt loop 19285 1727203940.52254: running the handler 19285 1727203940.52272: handler run complete 19285 1727203940.52283: attempt loop complete, returning result 19285 1727203940.52287: _execute() done 19285 1727203940.52289: dumping result to json 19285 1727203940.52292: done dumping result, returning 19285 1727203940.52298: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-f31b-fb3f-00000000048a] 19285 1727203940.52304: sending task result for task 028d2410-947f-f31b-fb3f-00000000048a ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 19285 1727203940.52610: no more pending results, returning what we have 19285 1727203940.52613: results queue empty 19285 1727203940.52614: checking for any_errors_fatal 19285 1727203940.52615: done checking for any_errors_fatal 19285 1727203940.52616: checking for max_fail_percentage 19285 1727203940.52617: done checking for max_fail_percentage 19285 1727203940.52618: checking to see if all hosts have failed and the running result is not ok 19285 1727203940.52619: done checking to see if all hosts have failed 19285 1727203940.52620: getting the remaining hosts for this loop 19285 1727203940.52621: done getting the remaining hosts for this loop 19285 1727203940.52624: getting the next task for host managed-node2 19285 1727203940.52630: done getting next task for host managed-node2 19285 1727203940.52632: ^ task is: TASK: Stat profile file 19285 1727203940.52636: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203940.52639: getting variables 19285 1727203940.52640: in VariableManager get_vars() 19285 1727203940.52667: Calling all_inventory to load vars for managed-node2 19285 1727203940.52671: Calling groups_inventory to load vars for managed-node2 19285 1727203940.52674: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.52686: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.52688: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.52691: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.53325: done sending task result for task 028d2410-947f-f31b-fb3f-00000000048a 19285 1727203940.53331: WORKER PROCESS EXITING 19285 1727203940.55067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.57041: done with get_vars() 19285 1727203940.57073: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:52:20 -0400 (0:00:00.077) 0:00:39.647 ***** 19285 1727203940.57352: entering _queue_task() for managed-node2/stat 19285 1727203940.58213: worker is 1 (out of 1 available) 19285 1727203940.58228: exiting _queue_task() for managed-node2/stat 19285 1727203940.58242: done queuing things up, now waiting for results queue to drain 19285 1727203940.58243: waiting for pending results... 
19285 1727203940.58797: running TaskExecutor() for managed-node2/TASK: Stat profile file 19285 1727203940.59117: in run() - task 028d2410-947f-f31b-fb3f-00000000048b 19285 1727203940.59180: variable 'ansible_search_path' from source: unknown 19285 1727203940.59204: variable 'ansible_search_path' from source: unknown 19285 1727203940.59247: calling self._execute() 19285 1727203940.59418: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.59422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.59430: variable 'omit' from source: magic vars 19285 1727203940.59831: variable 'ansible_distribution_major_version' from source: facts 19285 1727203940.59849: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203940.59862: variable 'omit' from source: magic vars 19285 1727203940.59916: variable 'omit' from source: magic vars 19285 1727203940.60020: variable 'profile' from source: play vars 19285 1727203940.60031: variable 'interface' from source: set_fact 19285 1727203940.60102: variable 'interface' from source: set_fact 19285 1727203940.60127: variable 'omit' from source: magic vars 19285 1727203940.60180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203940.60225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203940.60260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203940.60480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203940.60483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203940.60486: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 
1727203940.60488: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.60490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.60492: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203940.60494: Set connection var ansible_pipelining to False 19285 1727203940.60496: Set connection var ansible_timeout to 10 19285 1727203940.60498: Set connection var ansible_shell_type to sh 19285 1727203940.60500: Set connection var ansible_shell_executable to /bin/sh 19285 1727203940.60502: Set connection var ansible_connection to ssh 19285 1727203940.60503: variable 'ansible_shell_executable' from source: unknown 19285 1727203940.60505: variable 'ansible_connection' from source: unknown 19285 1727203940.60507: variable 'ansible_module_compression' from source: unknown 19285 1727203940.60510: variable 'ansible_shell_type' from source: unknown 19285 1727203940.60513: variable 'ansible_shell_executable' from source: unknown 19285 1727203940.60515: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.60526: variable 'ansible_pipelining' from source: unknown 19285 1727203940.60533: variable 'ansible_timeout' from source: unknown 19285 1727203940.60541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.60760: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203940.60777: variable 'omit' from source: magic vars 19285 1727203940.60788: starting attempt loop 19285 1727203940.60794: running the handler 19285 1727203940.60812: _low_level_execute_command(): starting 19285 1727203940.60824: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203940.61610: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203940.61624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.61707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.61742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.61817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.63508: stdout chunk (state=3): >>>/root <<< 19285 1727203940.63625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203940.63649: stderr chunk (state=3): >>><<< 19285 1727203940.63662: stdout chunk (state=3): >>><<< 19285 1727203940.63722: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203940.63726: _low_level_execute_command(): starting 19285 1727203940.63730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742 `" && echo ansible-tmp-1727203940.6368098-22390-216036428344742="` echo /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742 `" ) && sleep 0' 19285 1727203940.64605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203940.64714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203940.64717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203940.64734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203940.64817: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.64838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.64841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203940.64871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.65006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.66909: stdout chunk (state=3): >>>ansible-tmp-1727203940.6368098-22390-216036428344742=/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742 <<< 19285 1727203940.67032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203940.67083: stderr chunk (state=3): >>><<< 19285 1727203940.67086: stdout chunk (state=3): >>><<< 19285 1727203940.67109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203940.6368098-22390-216036428344742=/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203940.67182: variable 'ansible_module_compression' from source: unknown 19285 1727203940.67380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19285 1727203940.67383: variable 'ansible_facts' from source: unknown 19285 1727203940.67386: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py 19285 1727203940.67518: Sending initial data 19285 1727203940.67526: Sent initial data (153 bytes) 19285 1727203940.68093: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203940.68170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.68217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.68232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203940.68253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.68497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.70012: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203940.70120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203940.70197: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpmisfn0aq /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py <<< 19285 1727203940.70229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py" <<< 19285 1727203940.70284: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpmisfn0aq" to remote "/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py" <<< 19285 1727203940.71743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203940.71747: stdout chunk (state=3): >>><<< 19285 1727203940.71749: stderr chunk (state=3): >>><<< 19285 1727203940.71751: done transferring module to remote 19285 1727203940.71753: _low_level_execute_command(): starting 19285 1727203940.71755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/ /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py && sleep 0' 19285 1727203940.72544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203940.72550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.72613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.72640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203940.72667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.72754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.74647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203940.74651: stdout chunk (state=3): >>><<< 19285 1727203940.74655: stderr chunk (state=3): >>><<< 19285 1727203940.74863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203940.74868: _low_level_execute_command(): starting 19285 1727203940.74877: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/AnsiballZ_stat.py && sleep 0' 19285 1727203940.76877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203940.76882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203940.76889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203940.76892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.76896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203940.76899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203940.76901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203940.76903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.77297: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.77302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203940.77305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.77414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.92609: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19285 1727203940.93840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203940.93869: stderr chunk (state=3): >>><<< 19285 1727203940.93873: stdout chunk (state=3): >>><<< 19285 1727203940.93916: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203940.93940: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203940.93947: _low_level_execute_command(): starting 19285 1727203940.93952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203940.6368098-22390-216036428344742/ > /dev/null 2>&1 && sleep 0' 19285 1727203940.94585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203940.94590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203940.94594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.94596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203940.94598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203940.94605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203940.94668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203940.94672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203940.94678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203940.94774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203940.96609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203940.96638: stderr chunk (state=3): >>><<< 19285 1727203940.96642: stdout chunk (state=3): >>><<< 19285 1727203940.96657: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203940.96683: handler run complete 19285 1727203940.96699: attempt loop complete, returning result 19285 1727203940.96702: _execute() done 19285 1727203940.96704: dumping result to json 19285 1727203940.96706: done dumping result, returning 19285 1727203940.96711: done running TaskExecutor() for managed-node2/TASK: Stat profile file [028d2410-947f-f31b-fb3f-00000000048b] 19285 1727203940.96720: sending task result for task 028d2410-947f-f31b-fb3f-00000000048b 19285 1727203940.96813: done sending task result for task 028d2410-947f-f31b-fb3f-00000000048b 19285 1727203940.96816: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 19285 1727203940.96884: no more pending results, returning what we have 19285 1727203940.96887: results queue empty 19285 1727203940.96888: checking for any_errors_fatal 19285 1727203940.96897: done checking for any_errors_fatal 19285 1727203940.96897: checking for max_fail_percentage 19285 1727203940.96899: done checking for max_fail_percentage 19285 1727203940.96900: checking to see if all hosts have failed and the running result is not ok 19285 1727203940.96901: done checking to see if all 
hosts have failed 19285 1727203940.96902: getting the remaining hosts for this loop 19285 1727203940.96903: done getting the remaining hosts for this loop 19285 1727203940.96907: getting the next task for host managed-node2 19285 1727203940.96914: done getting next task for host managed-node2 19285 1727203940.96916: ^ task is: TASK: Set NM profile exist flag based on the profile files 19285 1727203940.96920: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203940.96923: getting variables 19285 1727203940.96933: in VariableManager get_vars() 19285 1727203940.96962: Calling all_inventory to load vars for managed-node2 19285 1727203940.96965: Calling groups_inventory to load vars for managed-node2 19285 1727203940.96969: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203940.96981: Calling all_plugins_play to load vars for managed-node2 19285 1727203940.96983: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203940.96986: Calling groups_plugins_play to load vars for managed-node2 19285 1727203940.97943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203940.98815: done with get_vars() 19285 1727203940.98832: done getting variables 19285 1727203940.98881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:52:20 -0400 (0:00:00.415) 0:00:40.063 ***** 19285 1727203940.98910: entering _queue_task() for managed-node2/set_fact 19285 1727203940.99168: worker is 1 (out of 1 available) 19285 1727203940.99183: exiting _queue_task() for managed-node2/set_fact 19285 1727203940.99196: done queuing things up, now waiting for results queue to drain 19285 1727203940.99198: waiting for pending results... 
19285 1727203940.99373: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 19285 1727203940.99466: in run() - task 028d2410-947f-f31b-fb3f-00000000048c 19285 1727203940.99477: variable 'ansible_search_path' from source: unknown 19285 1727203940.99481: variable 'ansible_search_path' from source: unknown 19285 1727203940.99508: calling self._execute() 19285 1727203940.99585: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203940.99589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203940.99597: variable 'omit' from source: magic vars 19285 1727203940.99869: variable 'ansible_distribution_major_version' from source: facts 19285 1727203940.99877: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203940.99959: variable 'profile_stat' from source: set_fact 19285 1727203940.99971: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203940.99974: when evaluation is False, skipping this task 19285 1727203940.99978: _execute() done 19285 1727203940.99981: dumping result to json 19285 1727203940.99984: done dumping result, returning 19285 1727203940.99986: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-f31b-fb3f-00000000048c] 19285 1727203940.99996: sending task result for task 028d2410-947f-f31b-fb3f-00000000048c 19285 1727203941.00079: done sending task result for task 028d2410-947f-f31b-fb3f-00000000048c 19285 1727203941.00083: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203941.00141: no more pending results, returning what we have 19285 1727203941.00145: results queue empty 19285 1727203941.00146: checking for any_errors_fatal 19285 1727203941.00158: done checking for any_errors_fatal 19285 1727203941.00158: 
checking for max_fail_percentage 19285 1727203941.00163: done checking for max_fail_percentage 19285 1727203941.00163: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.00164: done checking to see if all hosts have failed 19285 1727203941.00165: getting the remaining hosts for this loop 19285 1727203941.00166: done getting the remaining hosts for this loop 19285 1727203941.00170: getting the next task for host managed-node2 19285 1727203941.00181: done getting next task for host managed-node2 19285 1727203941.00184: ^ task is: TASK: Get NM profile info 19285 1727203941.00189: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.00194: getting variables 19285 1727203941.00195: in VariableManager get_vars() 19285 1727203941.00227: Calling all_inventory to load vars for managed-node2 19285 1727203941.00230: Calling groups_inventory to load vars for managed-node2 19285 1727203941.00233: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.00243: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.00245: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.00247: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.01107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.02308: done with get_vars() 19285 1727203941.02334: done getting variables 19285 1727203941.02398: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.035) 0:00:40.098 ***** 19285 1727203941.02420: entering _queue_task() for managed-node2/shell 19285 1727203941.02719: worker is 1 (out of 1 available) 19285 1727203941.02734: exiting _queue_task() for managed-node2/shell 19285 1727203941.02747: done queuing things up, now waiting for results queue to drain 19285 1727203941.02749: waiting for pending results... 
19285 1727203941.02953: running TaskExecutor() for managed-node2/TASK: Get NM profile info 19285 1727203941.03086: in run() - task 028d2410-947f-f31b-fb3f-00000000048d 19285 1727203941.03167: variable 'ansible_search_path' from source: unknown 19285 1727203941.03171: variable 'ansible_search_path' from source: unknown 19285 1727203941.03173: calling self._execute() 19285 1727203941.03244: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.03250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.03259: variable 'omit' from source: magic vars 19285 1727203941.03810: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.03814: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.03817: variable 'omit' from source: magic vars 19285 1727203941.03819: variable 'omit' from source: magic vars 19285 1727203941.03821: variable 'profile' from source: play vars 19285 1727203941.03824: variable 'interface' from source: set_fact 19285 1727203941.03850: variable 'interface' from source: set_fact 19285 1727203941.03879: variable 'omit' from source: magic vars 19285 1727203941.03946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203941.03987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203941.04007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203941.04027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.04040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.04182: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 
1727203941.04188: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.04191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.04193: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203941.04214: Set connection var ansible_pipelining to False 19285 1727203941.04220: Set connection var ansible_timeout to 10 19285 1727203941.04222: Set connection var ansible_shell_type to sh 19285 1727203941.04228: Set connection var ansible_shell_executable to /bin/sh 19285 1727203941.04232: Set connection var ansible_connection to ssh 19285 1727203941.04254: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.04257: variable 'ansible_connection' from source: unknown 19285 1727203941.04268: variable 'ansible_module_compression' from source: unknown 19285 1727203941.04271: variable 'ansible_shell_type' from source: unknown 19285 1727203941.04273: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.04277: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.04279: variable 'ansible_pipelining' from source: unknown 19285 1727203941.04282: variable 'ansible_timeout' from source: unknown 19285 1727203941.04292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.04431: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203941.04491: variable 'omit' from source: magic vars 19285 1727203941.04494: starting attempt loop 19285 1727203941.04497: running the handler 19285 1727203941.04500: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203941.04503: _low_level_execute_command(): starting 19285 1727203941.04505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203941.05178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.05184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203941.05189: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.05201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.05230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.05239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.05334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.07050: stdout chunk (state=3): >>>/root <<< 19285 1727203941.07154: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 19285 1727203941.07184: stderr chunk (state=3): >>><<< 19285 1727203941.07188: stdout chunk (state=3): >>><<< 19285 1727203941.07210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.07222: _low_level_execute_command(): starting 19285 1727203941.07228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872 `" && echo ansible-tmp-1727203941.07209-22411-77254594185872="` echo /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872 `" ) && sleep 0' 19285 1727203941.07651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.07666: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203941.07689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.07693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.07738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.07741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.07751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.07831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.09839: stdout chunk (state=3): >>>ansible-tmp-1727203941.07209-22411-77254594185872=/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872 <<< 19285 1727203941.10066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.10070: stdout chunk (state=3): >>><<< 19285 1727203941.10072: stderr chunk (state=3): >>><<< 19285 1727203941.10077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203941.07209-22411-77254594185872=/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.10095: variable 'ansible_module_compression' from source: unknown 19285 1727203941.10152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19285 1727203941.10245: variable 'ansible_facts' from source: unknown 19285 1727203941.10481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py 19285 1727203941.10886: Sending initial data 19285 1727203941.10892: Sent initial data (153 bytes) 19285 1727203941.11326: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.11333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203941.11405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.11418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.11440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.11518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.13142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 
1727203941.13234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203941.13304: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpjbkqdq4w /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py <<< 19285 1727203941.13307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py" <<< 19285 1727203941.13374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpjbkqdq4w" to remote "/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py" <<< 19285 1727203941.14783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.14796: stdout chunk (state=3): >>><<< 19285 1727203941.14801: stderr chunk (state=3): >>><<< 19285 1727203941.14931: done transferring module to remote 19285 1727203941.14940: _low_level_execute_command(): starting 19285 1727203941.14945: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/ /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py && sleep 0' 19285 1727203941.15611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203941.15651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.15654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.15657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203941.15678: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203941.15714: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.15754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.15890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.15893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.15991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.17741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.17759: stderr chunk (state=3): >>><<< 19285 1727203941.17762: stdout chunk (state=3): >>><<< 19285 1727203941.17779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.17782: _low_level_execute_command(): starting 19285 1727203941.17787: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/AnsiballZ_command.py && sleep 0' 19285 1727203941.18222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.18225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203941.18228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203941.18230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.18232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.18290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.18294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.18373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.34952: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:52:21.332677", "end": "2024-09-24 14:52:21.348508", "delta": "0:00:00.015831", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203941.36394: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203941.36428: stderr chunk (state=3): >>><<< 19285 1727203941.36432: stdout chunk (state=3): >>><<< 19285 1727203941.36448: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:52:21.332677", "end": "2024-09-24 14:52:21.348508", "delta": "0:00:00.015831", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.13.254 closed. 19285 1727203941.36479: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203941.36486: _low_level_execute_command(): starting 19285 1727203941.36494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203941.07209-22411-77254594185872/ > /dev/null 2>&1 && sleep 0' 19285 1727203941.36937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.36981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203941.36985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203941.36987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203941.36989: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.36991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203941.36993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.37036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.37039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.37117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.38952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.38980: stderr chunk (state=3): >>><<< 19285 1727203941.38983: stdout chunk (state=3): >>><<< 19285 1727203941.38998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.39004: handler run complete 19285 1727203941.39025: Evaluated conditional (False): False 19285 1727203941.39033: attempt loop complete, returning result 19285 1727203941.39035: _execute() done 19285 1727203941.39038: dumping result to json 19285 1727203941.39042: done dumping result, returning 19285 1727203941.39049: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [028d2410-947f-f31b-fb3f-00000000048d] 19285 1727203941.39054: sending task result for task 028d2410-947f-f31b-fb3f-00000000048d 19285 1727203941.39148: done sending task result for task 028d2410-947f-f31b-fb3f-00000000048d 19285 1727203941.39150: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.015831", "end": "2024-09-24 14:52:21.348508", "rc": 1, "start": "2024-09-24 14:52:21.332677" } MSG: non-zero return code ...ignoring 19285 1727203941.39225: no more pending results, returning what we have 19285 1727203941.39228: results queue empty 19285 1727203941.39229: checking for any_errors_fatal 19285 1727203941.39237: done checking for any_errors_fatal 19285 1727203941.39238: checking for max_fail_percentage 19285 1727203941.39239: done checking for max_fail_percentage 19285 1727203941.39240: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.39241: done checking to see if all hosts have failed 19285 1727203941.39242: getting the remaining hosts for this loop 19285 1727203941.39243: done getting the remaining hosts for this loop 19285 1727203941.39246: getting the next task for host managed-node2 19285 1727203941.39254: done getting next task for host managed-node2 19285 1727203941.39257: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19285 1727203941.39265: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 19285 1727203941.39269: getting variables 19285 1727203941.39270: in VariableManager get_vars() 19285 1727203941.39299: Calling all_inventory to load vars for managed-node2 19285 1727203941.39302: Calling groups_inventory to load vars for managed-node2 19285 1727203941.39306: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.39316: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.39318: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.39321: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.40244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.41114: done with get_vars() 19285 1727203941.41130: done getting variables 19285 1727203941.41173: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.387) 0:00:40.486 ***** 19285 1727203941.41199: entering _queue_task() for managed-node2/set_fact 19285 1727203941.41441: worker is 1 (out of 1 available) 19285 1727203941.41455: exiting _queue_task() for managed-node2/set_fact 19285 1727203941.41469: done queuing things up, now waiting for results queue to drain 19285 1727203941.41471: waiting for pending results... 
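The ignored failure above and the skip that follows are linked by the follow-up task's `when: nm_profile_exists.rc == 0` guard, which the trace shows evaluating to False. A minimal sketch (plain Python, not Ansible internals) of that evaluation, using the result dict copied from the log:

```python
# Registered result, abbreviated from the "Get NM profile info" task above.
nm_profile_exists = {
    "changed": False,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc",
    "rc": 1,  # grep found no /etc-backed profile named LSR-TST-br31
    "msg": "non-zero return code",
}

# The set_fact task guards on `when: nm_profile_exists.rc == 0`;
# with rc=1 the condition is False, so the task is skipped.
condition = nm_profile_exists["rc"] == 0
print(condition)  # False -> "when evaluation is False, skipping this task"
```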
19285 1727203941.41637: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19285 1727203941.41726: in run() - task 028d2410-947f-f31b-fb3f-00000000048e 19285 1727203941.41737: variable 'ansible_search_path' from source: unknown 19285 1727203941.41740: variable 'ansible_search_path' from source: unknown 19285 1727203941.41769: calling self._execute() 19285 1727203941.41844: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.41848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.41858: variable 'omit' from source: magic vars 19285 1727203941.42130: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.42146: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.42229: variable 'nm_profile_exists' from source: set_fact 19285 1727203941.42243: Evaluated conditional (nm_profile_exists.rc == 0): False 19285 1727203941.42246: when evaluation is False, skipping this task 19285 1727203941.42249: _execute() done 19285 1727203941.42253: dumping result to json 19285 1727203941.42256: done dumping result, returning 19285 1727203941.42262: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-f31b-fb3f-00000000048e] 19285 1727203941.42265: sending task result for task 028d2410-947f-f31b-fb3f-00000000048e 19285 1727203941.42347: done sending task result for task 028d2410-947f-f31b-fb3f-00000000048e 19285 1727203941.42350: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 19285 1727203941.42418: no more pending results, returning what we have 19285 1727203941.42422: results queue empty 19285 1727203941.42423: checking for any_errors_fatal 19285 
1727203941.42428: done checking for any_errors_fatal 19285 1727203941.42429: checking for max_fail_percentage 19285 1727203941.42430: done checking for max_fail_percentage 19285 1727203941.42431: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.42431: done checking to see if all hosts have failed 19285 1727203941.42432: getting the remaining hosts for this loop 19285 1727203941.42434: done getting the remaining hosts for this loop 19285 1727203941.42437: getting the next task for host managed-node2 19285 1727203941.42444: done getting next task for host managed-node2 19285 1727203941.42446: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 19285 1727203941.42450: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.42453: getting variables 19285 1727203941.42454: in VariableManager get_vars() 19285 1727203941.42480: Calling all_inventory to load vars for managed-node2 19285 1727203941.42483: Calling groups_inventory to load vars for managed-node2 19285 1727203941.42486: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.42495: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.42497: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.42499: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.43252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.44219: done with get_vars() 19285 1727203941.44233: done getting variables 19285 1727203941.44276: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203941.44358: variable 'profile' from source: play vars 19285 1727203941.44362: variable 'interface' from source: set_fact 19285 1727203941.44405: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.032) 0:00:40.518 ***** 19285 1727203941.44427: entering _queue_task() for managed-node2/command 19285 1727203941.44634: worker is 1 (out of 1 available) 19285 1727203941.44648: exiting _queue_task() for managed-node2/command 19285 1727203941.44660: done queuing things up, now waiting for results queue to drain 19285 1727203941.44662: waiting for pending results... 
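The task title in the source file is `Get the ansible_managed comment in ifcfg-{{ profile }}`; the trace shows `profile` resolving through `interface` (set_fact) before the header prints as `ifcfg-LSR-TST-br31`. Ansible does this rendering with Jinja2; the stdlib stand-in below only illustrates the substitution with the value from the log:

```python
# Illustrative stand-in: Ansible actually renders {{ profile }} via Jinja2.
# A plain str.replace shows the same substitution with the logged value.
title = "Get the ansible_managed comment in ifcfg-{{ profile }}"
rendered = title.replace("{{ profile }}", "LSR-TST-br31")
print(rendered)  # Get the ansible_managed comment in ifcfg-LSR-TST-br31
```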
19285 1727203941.44833: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 19285 1727203941.44927: in run() - task 028d2410-947f-f31b-fb3f-000000000490 19285 1727203941.44939: variable 'ansible_search_path' from source: unknown 19285 1727203941.44942: variable 'ansible_search_path' from source: unknown 19285 1727203941.44971: calling self._execute() 19285 1727203941.45044: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.45048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.45058: variable 'omit' from source: magic vars 19285 1727203941.45334: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.45338: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.45416: variable 'profile_stat' from source: set_fact 19285 1727203941.45428: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203941.45432: when evaluation is False, skipping this task 19285 1727203941.45437: _execute() done 19285 1727203941.45439: dumping result to json 19285 1727203941.45442: done dumping result, returning 19285 1727203941.45445: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000490] 19285 1727203941.45447: sending task result for task 028d2410-947f-f31b-fb3f-000000000490 19285 1727203941.45532: done sending task result for task 028d2410-947f-f31b-fb3f-000000000490 19285 1727203941.45534: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203941.45607: no more pending results, returning what we have 19285 1727203941.45610: results queue empty 19285 1727203941.45611: checking for any_errors_fatal 19285 1727203941.45616: done checking for any_errors_fatal 19285 
1727203941.45616: checking for max_fail_percentage 19285 1727203941.45618: done checking for max_fail_percentage 19285 1727203941.45618: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.45619: done checking to see if all hosts have failed 19285 1727203941.45620: getting the remaining hosts for this loop 19285 1727203941.45621: done getting the remaining hosts for this loop 19285 1727203941.45624: getting the next task for host managed-node2 19285 1727203941.45630: done getting next task for host managed-node2 19285 1727203941.45633: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 19285 1727203941.45636: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.45639: getting variables 19285 1727203941.45640: in VariableManager get_vars() 19285 1727203941.45663: Calling all_inventory to load vars for managed-node2 19285 1727203941.45667: Calling groups_inventory to load vars for managed-node2 19285 1727203941.45670: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.45680: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.45683: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.45685: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.46439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.47317: done with get_vars() 19285 1727203941.47331: done getting variables 19285 1727203941.47377: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203941.47448: variable 'profile' from source: play vars 19285 1727203941.47451: variable 'interface' from source: set_fact 19285 1727203941.47495: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.030) 0:00:40.549 ***** 19285 1727203941.47518: entering _queue_task() for managed-node2/set_fact 19285 1727203941.47723: worker is 1 (out of 1 available) 19285 1727203941.47735: exiting _queue_task() for managed-node2/set_fact 19285 1727203941.47746: done queuing things up, now waiting for results queue to drain 19285 1727203941.47747: waiting for pending results... 
19285 1727203941.47918: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 19285 1727203941.48005: in run() - task 028d2410-947f-f31b-fb3f-000000000491 19285 1727203941.48082: variable 'ansible_search_path' from source: unknown 19285 1727203941.48086: variable 'ansible_search_path' from source: unknown 19285 1727203941.48089: calling self._execute() 19285 1727203941.48117: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.48124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.48135: variable 'omit' from source: magic vars 19285 1727203941.48393: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.48403: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.48488: variable 'profile_stat' from source: set_fact 19285 1727203941.48498: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203941.48502: when evaluation is False, skipping this task 19285 1727203941.48504: _execute() done 19285 1727203941.48507: dumping result to json 19285 1727203941.48509: done dumping result, returning 19285 1727203941.48515: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000491] 19285 1727203941.48520: sending task result for task 028d2410-947f-f31b-fb3f-000000000491 19285 1727203941.48604: done sending task result for task 028d2410-947f-f31b-fb3f-000000000491 19285 1727203941.48606: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203941.48675: no more pending results, returning what we have 19285 1727203941.48680: results queue empty 19285 1727203941.48681: checking for any_errors_fatal 19285 1727203941.48685: done checking for any_errors_fatal 19285 
1727203941.48686: checking for max_fail_percentage 19285 1727203941.48687: done checking for max_fail_percentage 19285 1727203941.48688: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.48689: done checking to see if all hosts have failed 19285 1727203941.48690: getting the remaining hosts for this loop 19285 1727203941.48691: done getting the remaining hosts for this loop 19285 1727203941.48694: getting the next task for host managed-node2 19285 1727203941.48699: done getting next task for host managed-node2 19285 1727203941.48701: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 19285 1727203941.48705: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.48708: getting variables 19285 1727203941.48709: in VariableManager get_vars() 19285 1727203941.48734: Calling all_inventory to load vars for managed-node2 19285 1727203941.48736: Calling groups_inventory to load vars for managed-node2 19285 1727203941.48739: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.48748: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.48750: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.48752: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.49613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.50468: done with get_vars() 19285 1727203941.50483: done getting variables 19285 1727203941.50525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203941.50598: variable 'profile' from source: play vars 19285 1727203941.50601: variable 'interface' from source: set_fact 19285 1727203941.50641: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.031) 0:00:40.581 ***** 19285 1727203941.50664: entering _queue_task() for managed-node2/command 19285 1727203941.50868: worker is 1 (out of 1 available) 19285 1727203941.50883: exiting _queue_task() for managed-node2/command 19285 1727203941.50895: done queuing things up, now waiting for results queue to drain 19285 1727203941.50897: waiting for pending results... 
19285 1727203941.51061: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 19285 1727203941.51148: in run() - task 028d2410-947f-f31b-fb3f-000000000492 19285 1727203941.51158: variable 'ansible_search_path' from source: unknown 19285 1727203941.51162: variable 'ansible_search_path' from source: unknown 19285 1727203941.51192: calling self._execute() 19285 1727203941.51262: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.51270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.51278: variable 'omit' from source: magic vars 19285 1727203941.51528: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.51537: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.51623: variable 'profile_stat' from source: set_fact 19285 1727203941.51633: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203941.51636: when evaluation is False, skipping this task 19285 1727203941.51639: _execute() done 19285 1727203941.51642: dumping result to json 19285 1727203941.51644: done dumping result, returning 19285 1727203941.51649: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000492] 19285 1727203941.51654: sending task result for task 028d2410-947f-f31b-fb3f-000000000492 19285 1727203941.51736: done sending task result for task 028d2410-947f-f31b-fb3f-000000000492 19285 1727203941.51739: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203941.51821: no more pending results, returning what we have 19285 1727203941.51824: results queue empty 19285 1727203941.51825: checking for any_errors_fatal 19285 1727203941.51830: done checking for any_errors_fatal 19285 1727203941.51831: 
checking for max_fail_percentage 19285 1727203941.51833: done checking for max_fail_percentage 19285 1727203941.51833: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.51834: done checking to see if all hosts have failed 19285 1727203941.51835: getting the remaining hosts for this loop 19285 1727203941.51836: done getting the remaining hosts for this loop 19285 1727203941.51839: getting the next task for host managed-node2 19285 1727203941.51844: done getting next task for host managed-node2 19285 1727203941.51847: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 19285 1727203941.51850: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.51854: getting variables 19285 1727203941.51855: in VariableManager get_vars() 19285 1727203941.51880: Calling all_inventory to load vars for managed-node2 19285 1727203941.51882: Calling groups_inventory to load vars for managed-node2 19285 1727203941.51886: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.51895: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.51897: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.51899: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.52635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.53520: done with get_vars() 19285 1727203941.53536: done getting variables 19285 1727203941.53581: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203941.53655: variable 'profile' from source: play vars 19285 1727203941.53658: variable 'interface' from source: set_fact 19285 1727203941.53700: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.030) 0:00:40.611 ***** 19285 1727203941.53721: entering _queue_task() for managed-node2/set_fact 19285 1727203941.53936: worker is 1 (out of 1 available) 19285 1727203941.53948: exiting _queue_task() for managed-node2/set_fact 19285 1727203941.53964: done queuing things up, now waiting for results queue to drain 19285 1727203941.53966: waiting for pending results... 
19285 1727203941.54129: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 19285 1727203941.54211: in run() - task 028d2410-947f-f31b-fb3f-000000000493 19285 1727203941.54222: variable 'ansible_search_path' from source: unknown 19285 1727203941.54225: variable 'ansible_search_path' from source: unknown 19285 1727203941.54253: calling self._execute() 19285 1727203941.54326: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.54330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.54339: variable 'omit' from source: magic vars 19285 1727203941.54591: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.54600: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.54686: variable 'profile_stat' from source: set_fact 19285 1727203941.54696: Evaluated conditional (profile_stat.stat.exists): False 19285 1727203941.54699: when evaluation is False, skipping this task 19285 1727203941.54702: _execute() done 19285 1727203941.54705: dumping result to json 19285 1727203941.54707: done dumping result, returning 19285 1727203941.54712: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [028d2410-947f-f31b-fb3f-000000000493] 19285 1727203941.54718: sending task result for task 028d2410-947f-f31b-fb3f-000000000493 19285 1727203941.54802: done sending task result for task 028d2410-947f-f31b-fb3f-000000000493 19285 1727203941.54805: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19285 1727203941.54884: no more pending results, returning what we have 19285 1727203941.54887: results queue empty 19285 1727203941.54887: checking for any_errors_fatal 19285 1727203941.54892: done checking for any_errors_fatal 19285 1727203941.54892: 
checking for max_fail_percentage 19285 1727203941.54894: done checking for max_fail_percentage 19285 1727203941.54894: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.54895: done checking to see if all hosts have failed 19285 1727203941.54896: getting the remaining hosts for this loop 19285 1727203941.54897: done getting the remaining hosts for this loop 19285 1727203941.54900: getting the next task for host managed-node2 19285 1727203941.54906: done getting next task for host managed-node2 19285 1727203941.54908: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 19285 1727203941.54911: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.54915: getting variables 19285 1727203941.54916: in VariableManager get_vars() 19285 1727203941.54939: Calling all_inventory to load vars for managed-node2 19285 1727203941.54941: Calling groups_inventory to load vars for managed-node2 19285 1727203941.54944: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.54953: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.54955: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.54957: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.55822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.56678: done with get_vars() 19285 1727203941.56692: done getting variables 19285 1727203941.56733: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203941.56815: variable 'profile' from source: play vars 19285 1727203941.56818: variable 'interface' from source: set_fact 19285 1727203941.56856: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.031) 0:00:40.643 ***** 19285 1727203941.56881: entering _queue_task() for managed-node2/assert 19285 1727203941.57089: worker is 1 (out of 1 available) 19285 1727203941.57104: exiting _queue_task() for managed-node2/assert 19285 1727203941.57116: done queuing things up, now waiting for results queue to drain 19285 1727203941.57118: waiting for pending results... 
19285 1727203941.57274: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' 19285 1727203941.57340: in run() - task 028d2410-947f-f31b-fb3f-000000000480 19285 1727203941.57358: variable 'ansible_search_path' from source: unknown 19285 1727203941.57365: variable 'ansible_search_path' from source: unknown 19285 1727203941.57385: calling self._execute() 19285 1727203941.57451: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.57456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.57463: variable 'omit' from source: magic vars 19285 1727203941.57711: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.57719: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.57725: variable 'omit' from source: magic vars 19285 1727203941.57750: variable 'omit' from source: magic vars 19285 1727203941.57824: variable 'profile' from source: play vars 19285 1727203941.57828: variable 'interface' from source: set_fact 19285 1727203941.57873: variable 'interface' from source: set_fact 19285 1727203941.57888: variable 'omit' from source: magic vars 19285 1727203941.57923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203941.57950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203941.57968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203941.57983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.57992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.58023: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 19285 1727203941.58026: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.58029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.58090: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203941.58097: Set connection var ansible_pipelining to False 19285 1727203941.58102: Set connection var ansible_timeout to 10 19285 1727203941.58105: Set connection var ansible_shell_type to sh 19285 1727203941.58111: Set connection var ansible_shell_executable to /bin/sh 19285 1727203941.58114: Set connection var ansible_connection to ssh 19285 1727203941.58132: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.58134: variable 'ansible_connection' from source: unknown 19285 1727203941.58137: variable 'ansible_module_compression' from source: unknown 19285 1727203941.58139: variable 'ansible_shell_type' from source: unknown 19285 1727203941.58141: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.58144: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.58149: variable 'ansible_pipelining' from source: unknown 19285 1727203941.58152: variable 'ansible_timeout' from source: unknown 19285 1727203941.58155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.58255: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203941.58263: variable 'omit' from source: magic vars 19285 1727203941.58271: starting attempt loop 19285 1727203941.58274: running the handler 19285 1727203941.58358: variable 'lsr_net_profile_exists' from source: set_fact 19285 1727203941.58363: Evaluated conditional (not 
lsr_net_profile_exists): True 19285 1727203941.58367: handler run complete 19285 1727203941.58379: attempt loop complete, returning result 19285 1727203941.58381: _execute() done 19285 1727203941.58384: dumping result to json 19285 1727203941.58387: done dumping result, returning 19285 1727203941.58393: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-000000000480] 19285 1727203941.58398: sending task result for task 028d2410-947f-f31b-fb3f-000000000480 19285 1727203941.58473: done sending task result for task 028d2410-947f-f31b-fb3f-000000000480 19285 1727203941.58477: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 19285 1727203941.58522: no more pending results, returning what we have 19285 1727203941.58526: results queue empty 19285 1727203941.58526: checking for any_errors_fatal 19285 1727203941.58535: done checking for any_errors_fatal 19285 1727203941.58536: checking for max_fail_percentage 19285 1727203941.58537: done checking for max_fail_percentage 19285 1727203941.58538: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.58540: done checking to see if all hosts have failed 19285 1727203941.58540: getting the remaining hosts for this loop 19285 1727203941.58542: done getting the remaining hosts for this loop 19285 1727203941.58546: getting the next task for host managed-node2 19285 1727203941.58553: done getting next task for host managed-node2 19285 1727203941.58555: ^ task is: TASK: meta (flush_handlers) 19285 1727203941.58557: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.58563: getting variables 19285 1727203941.58564: in VariableManager get_vars() 19285 1727203941.58591: Calling all_inventory to load vars for managed-node2 19285 1727203941.58593: Calling groups_inventory to load vars for managed-node2 19285 1727203941.58596: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.58605: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.58607: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.58610: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.59365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.60237: done with get_vars() 19285 1727203941.60253: done getting variables 19285 1727203941.60303: in VariableManager get_vars() 19285 1727203941.60310: Calling all_inventory to load vars for managed-node2 19285 1727203941.60311: Calling groups_inventory to load vars for managed-node2 19285 1727203941.60313: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.60316: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.60317: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.60319: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.61030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.61894: done with get_vars() 19285 1727203941.61910: done queuing things up, now waiting for results queue to drain 19285 1727203941.61912: results queue empty 19285 1727203941.61912: checking for any_errors_fatal 19285 1727203941.61914: done checking for any_errors_fatal 19285 1727203941.61914: checking for max_fail_percentage 19285 1727203941.61915: done checking for max_fail_percentage 19285 1727203941.61916: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203941.61922: done checking to see if all hosts have failed 19285 1727203941.61923: getting the remaining hosts for this loop 19285 1727203941.61923: done getting the remaining hosts for this loop 19285 1727203941.61926: getting the next task for host managed-node2 19285 1727203941.61928: done getting next task for host managed-node2 19285 1727203941.61929: ^ task is: TASK: meta (flush_handlers) 19285 1727203941.61930: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203941.61932: getting variables 19285 1727203941.61932: in VariableManager get_vars() 19285 1727203941.61938: Calling all_inventory to load vars for managed-node2 19285 1727203941.61939: Calling groups_inventory to load vars for managed-node2 19285 1727203941.61940: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.61943: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.61945: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.61946: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.62581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.63471: done with get_vars() 19285 1727203941.63486: done getting variables 19285 1727203941.63516: in VariableManager get_vars() 19285 1727203941.63521: Calling all_inventory to load vars for managed-node2 19285 1727203941.63523: Calling groups_inventory to load vars for managed-node2 19285 1727203941.63524: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.63527: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.63528: Calling groups_plugins_inventory to load vars for 
managed-node2 19285 1727203941.63530: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.64152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.65017: done with get_vars() 19285 1727203941.65034: done queuing things up, now waiting for results queue to drain 19285 1727203941.65035: results queue empty 19285 1727203941.65036: checking for any_errors_fatal 19285 1727203941.65037: done checking for any_errors_fatal 19285 1727203941.65037: checking for max_fail_percentage 19285 1727203941.65038: done checking for max_fail_percentage 19285 1727203941.65038: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.65039: done checking to see if all hosts have failed 19285 1727203941.65039: getting the remaining hosts for this loop 19285 1727203941.65040: done getting the remaining hosts for this loop 19285 1727203941.65041: getting the next task for host managed-node2 19285 1727203941.65043: done getting next task for host managed-node2 19285 1727203941.65044: ^ task is: None 19285 1727203941.65045: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.65045: done queuing things up, now waiting for results queue to drain 19285 1727203941.65046: results queue empty 19285 1727203941.65046: checking for any_errors_fatal 19285 1727203941.65047: done checking for any_errors_fatal 19285 1727203941.65047: checking for max_fail_percentage 19285 1727203941.65048: done checking for max_fail_percentage 19285 1727203941.65048: checking to see if all hosts have failed and the running result is not ok 19285 1727203941.65048: done checking to see if all hosts have failed 19285 1727203941.65049: getting the next task for host managed-node2 19285 1727203941.65051: done getting next task for host managed-node2 19285 1727203941.65051: ^ task is: None 19285 1727203941.65053: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.65091: in VariableManager get_vars() 19285 1727203941.65101: done with get_vars() 19285 1727203941.65105: in VariableManager get_vars() 19285 1727203941.65110: done with get_vars() 19285 1727203941.65113: variable 'omit' from source: magic vars 19285 1727203941.65193: variable 'task' from source: play vars 19285 1727203941.65215: in VariableManager get_vars() 19285 1727203941.65222: done with get_vars() 19285 1727203941.65233: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 19285 1727203941.65391: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203941.65411: getting the remaining hosts for this loop 19285 1727203941.65412: done getting the remaining hosts for this loop 19285 1727203941.65414: getting the next task for host managed-node2 19285 1727203941.65416: done getting next task for host managed-node2 19285 1727203941.65417: ^ task is: TASK: Gathering Facts 19285 1727203941.65418: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203941.65419: getting variables 19285 1727203941.65420: in VariableManager get_vars() 19285 1727203941.65425: Calling all_inventory to load vars for managed-node2 19285 1727203941.65427: Calling groups_inventory to load vars for managed-node2 19285 1727203941.65428: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203941.65431: Calling all_plugins_play to load vars for managed-node2 19285 1727203941.65433: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203941.65434: Calling groups_plugins_play to load vars for managed-node2 19285 1727203941.66163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203941.67008: done with get_vars() 19285 1727203941.67025: done getting variables 19285 1727203941.67054: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:52:21 -0400 (0:00:00.101) 0:00:40.745 ***** 19285 1727203941.67073: entering _queue_task() for managed-node2/gather_facts 19285 1727203941.67318: worker is 1 (out of 1 available) 19285 1727203941.67329: exiting _queue_task() for managed-node2/gather_facts 19285 1727203941.67340: done queuing things up, now waiting for results queue to drain 19285 1727203941.67341: waiting for pending results... 
19285 1727203941.67511: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203941.67571: in run() - task 028d2410-947f-f31b-fb3f-0000000004c5 19285 1727203941.67586: variable 'ansible_search_path' from source: unknown 19285 1727203941.67614: calling self._execute() 19285 1727203941.67691: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.67694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.67698: variable 'omit' from source: magic vars 19285 1727203941.67957: variable 'ansible_distribution_major_version' from source: facts 19285 1727203941.67967: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203941.67972: variable 'omit' from source: magic vars 19285 1727203941.67996: variable 'omit' from source: magic vars 19285 1727203941.68026: variable 'omit' from source: magic vars 19285 1727203941.68061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203941.68088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203941.68104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203941.68125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.68131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203941.68156: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203941.68161: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.68164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.68233: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203941.68245: Set connection var ansible_pipelining to False 19285 1727203941.68248: Set connection var ansible_timeout to 10 19285 1727203941.68251: Set connection var ansible_shell_type to sh 19285 1727203941.68262: Set connection var ansible_shell_executable to /bin/sh 19285 1727203941.68264: Set connection var ansible_connection to ssh 19285 1727203941.68279: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.68282: variable 'ansible_connection' from source: unknown 19285 1727203941.68285: variable 'ansible_module_compression' from source: unknown 19285 1727203941.68287: variable 'ansible_shell_type' from source: unknown 19285 1727203941.68289: variable 'ansible_shell_executable' from source: unknown 19285 1727203941.68292: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203941.68296: variable 'ansible_pipelining' from source: unknown 19285 1727203941.68299: variable 'ansible_timeout' from source: unknown 19285 1727203941.68303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203941.68435: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203941.68444: variable 'omit' from source: magic vars 19285 1727203941.68447: starting attempt loop 19285 1727203941.68450: running the handler 19285 1727203941.68467: variable 'ansible_facts' from source: unknown 19285 1727203941.68487: _low_level_execute_command(): starting 19285 1727203941.68494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203941.69024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19285 1727203941.69029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.69033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.69078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.69082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.69099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.69182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.70877: stdout chunk (state=3): >>>/root <<< 19285 1727203941.70974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.71005: stderr chunk (state=3): >>><<< 19285 1727203941.71009: stdout chunk (state=3): >>><<< 19285 1727203941.71030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.71041: _low_level_execute_command(): starting 19285 1727203941.71047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622 `" && echo ansible-tmp-1727203941.710294-22441-11298426076622="` echo /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622 `" ) && sleep 0' 19285 1727203941.71489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.71494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.71497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203941.71508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.71548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.71551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.71632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.73566: stdout chunk (state=3): >>>ansible-tmp-1727203941.710294-22441-11298426076622=/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622 <<< 19285 1727203941.73668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.73698: stderr chunk (state=3): >>><<< 19285 1727203941.73701: stdout chunk (state=3): >>><<< 19285 1727203941.73713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203941.710294-22441-11298426076622=/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.73738: variable 'ansible_module_compression' from source: unknown 19285 1727203941.73779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203941.73828: variable 'ansible_facts' from source: unknown 19285 1727203941.73961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py 19285 1727203941.74058: Sending initial data 19285 1727203941.74061: Sent initial data (152 bytes) 19285 1727203941.74509: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.74513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203941.74515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.74517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19285 1727203941.74519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203941.74521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.74583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.74585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203941.74587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.74647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.76279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 19285 1727203941.76282: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203941.76347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203941.76419: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpakdbfawr /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py <<< 19285 1727203941.76422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py" <<< 19285 1727203941.76489: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpakdbfawr" to remote "/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py" <<< 19285 1727203941.76492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py" <<< 19285 1727203941.77676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.77713: stderr chunk (state=3): >>><<< 19285 1727203941.77716: stdout chunk (state=3): >>><<< 19285 1727203941.77732: done transferring module to remote 19285 1727203941.77740: _low_level_execute_command(): starting 19285 1727203941.77744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/ /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py && sleep 0' 19285 1727203941.78186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.78190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203941.78192: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.78194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.78196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203941.78201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.78252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.78255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.78325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203941.80182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203941.80203: stderr chunk (state=3): >>><<< 19285 1727203941.80206: stdout chunk (state=3): >>><<< 19285 1727203941.80218: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203941.80221: _low_level_execute_command(): starting 19285 1727203941.80226: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/AnsiballZ_setup.py && sleep 0' 19285 1727203941.80647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.80651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.80655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203941.80657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203941.80659: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203941.80708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203941.80711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203941.80796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.43216: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "22", "epoch": "1727203942", "epoch_int": "1727203942", "date": "2024-09-24", "time": "14:52:22", "iso8601_micro": "2024-09-24T18:52:22.078577Z", "iso8601": "2024-09-24T18:52:22Z", "iso8601_basic": "20240924T145222078577", "iso8601_basic_short": "20240924T145222", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "<<< 19285 1727203942.43266: stdout chunk (state=3): >>>enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.4326171875, "15m": 0.2158203125}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": 
"2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 528, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787844608, "block_size": 4096, "block_total": 65519099, "block_available": 63913048, "block_used": 1606051, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": 
"on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203942.45220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 19285 1727203942.45248: stderr chunk (state=3): >>><<< 19285 1727203942.45251: stdout chunk (state=3): >>><<< 19285 1727203942.45282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "22", "epoch": "1727203942", "epoch_int": "1727203942", "date": "2024-09-24", "time": "14:52:22", "iso8601_micro": "2024-09-24T18:52:22.078577Z", "iso8601": "2024-09-24T18:52:22Z", "iso8601_basic": "20240924T145222078577", "iso8601_basic_short": "20240924T145222", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.4326171875, "15m": 0.2158203125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 528, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787844608, "block_size": 4096, "block_total": 65519099, 
"block_available": 63913048, "block_used": 1606051, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", 
"loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", 
"127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203942.45510: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203942.45528: _low_level_execute_command(): starting 19285 1727203942.45531: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203941.710294-22441-11298426076622/ > /dev/null 2>&1 && sleep 0' 19285 1727203942.45966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203942.45970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.45972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203942.45974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203942.45978: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.46028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203942.46034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203942.46036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.46107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.47918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203942.47940: stderr chunk (state=3): >>><<< 19285 1727203942.47943: stdout chunk (state=3): >>><<< 19285 1727203942.47957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203942.47965: handler run complete 19285 1727203942.48041: variable 'ansible_facts' from source: unknown 19285 1727203942.48119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.48295: variable 'ansible_facts' from source: unknown 19285 1727203942.48346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.48424: attempt loop complete, returning result 19285 1727203942.48427: _execute() done 19285 1727203942.48429: dumping result to json 19285 1727203942.48447: done dumping result, returning 19285 1727203942.48455: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-0000000004c5] 19285 1727203942.48462: sending task result for task 028d2410-947f-f31b-fb3f-0000000004c5 19285 1727203942.48734: done sending task result for task 028d2410-947f-f31b-fb3f-0000000004c5 19285 1727203942.48736: WORKER PROCESS EXITING ok: [managed-node2] 19285 1727203942.48957: no more pending results, returning what we have 19285 1727203942.48962: results queue empty 19285 1727203942.48962: checking for any_errors_fatal 19285 1727203942.48963: done checking for any_errors_fatal 19285 1727203942.48964: checking for max_fail_percentage 19285 1727203942.48965: done checking for max_fail_percentage 19285 1727203942.48966: checking to see if all hosts have failed and the running result is not ok 19285 1727203942.48966: done checking to see if all hosts have failed 19285 1727203942.48967: getting the remaining hosts for this loop 19285 1727203942.48968: done getting the remaining hosts for this loop 19285 1727203942.48970: getting the next task for host managed-node2 19285 1727203942.48974: done getting next task for host managed-node2 19285 1727203942.48977: ^ 
task is: TASK: meta (flush_handlers) 19285 1727203942.48979: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203942.48982: getting variables 19285 1727203942.48983: in VariableManager get_vars() 19285 1727203942.48999: Calling all_inventory to load vars for managed-node2 19285 1727203942.49000: Calling groups_inventory to load vars for managed-node2 19285 1727203942.49003: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.49010: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.49012: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.49014: Calling groups_plugins_play to load vars for managed-node2 19285 1727203942.50244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.51772: done with get_vars() 19285 1727203942.51797: done getting variables 19285 1727203942.51892: in VariableManager get_vars() 19285 1727203942.51906: Calling all_inventory to load vars for managed-node2 19285 1727203942.51909: Calling groups_inventory to load vars for managed-node2 19285 1727203942.51911: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.51917: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.51919: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.51922: Calling groups_plugins_play to load vars for managed-node2 19285 1727203942.52905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.53992: done with get_vars() 19285 1727203942.54012: done queuing things up, now waiting for results queue to drain 19285 
1727203942.54014: results queue empty 19285 1727203942.54014: checking for any_errors_fatal 19285 1727203942.54017: done checking for any_errors_fatal 19285 1727203942.54022: checking for max_fail_percentage 19285 1727203942.54023: done checking for max_fail_percentage 19285 1727203942.54024: checking to see if all hosts have failed and the running result is not ok 19285 1727203942.54025: done checking to see if all hosts have failed 19285 1727203942.54026: getting the remaining hosts for this loop 19285 1727203942.54026: done getting the remaining hosts for this loop 19285 1727203942.54031: getting the next task for host managed-node2 19285 1727203942.54035: done getting next task for host managed-node2 19285 1727203942.54038: ^ task is: TASK: Include the task '{{ task }}' 19285 1727203942.54039: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203942.54042: getting variables 19285 1727203942.54043: in VariableManager get_vars() 19285 1727203942.54056: Calling all_inventory to load vars for managed-node2 19285 1727203942.54058: Calling groups_inventory to load vars for managed-node2 19285 1727203942.54061: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.54067: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.54069: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.54072: Calling groups_plugins_play to load vars for managed-node2 19285 1727203942.54897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.55754: done with get_vars() 19285 1727203942.55769: done getting variables 19285 1727203942.55891: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:52:22 -0400 (0:00:00.888) 0:00:41.633 ***** 19285 1727203942.55913: entering _queue_task() for managed-node2/include_tasks 19285 1727203942.56193: worker is 1 (out of 1 available) 19285 1727203942.56204: exiting _queue_task() for managed-node2/include_tasks 19285 1727203942.56217: done queuing things up, now waiting for results queue to drain 19285 1727203942.56218: waiting for pending results... 
19285 1727203942.56396: running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' 19285 1727203942.56473: in run() - task 028d2410-947f-f31b-fb3f-000000000077 19285 1727203942.56487: variable 'ansible_search_path' from source: unknown 19285 1727203942.56516: calling self._execute() 19285 1727203942.56594: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203942.56599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203942.56607: variable 'omit' from source: magic vars 19285 1727203942.56878: variable 'ansible_distribution_major_version' from source: facts 19285 1727203942.56890: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203942.56893: variable 'task' from source: play vars 19285 1727203942.56939: variable 'task' from source: play vars 19285 1727203942.56946: _execute() done 19285 1727203942.56950: dumping result to json 19285 1727203942.56953: done dumping result, returning 19285 1727203942.56959: done running TaskExecutor() for managed-node2/TASK: Include the task 'tasks/assert_device_absent.yml' [028d2410-947f-f31b-fb3f-000000000077] 19285 1727203942.56967: sending task result for task 028d2410-947f-f31b-fb3f-000000000077 19285 1727203942.57059: done sending task result for task 028d2410-947f-f31b-fb3f-000000000077 19285 1727203942.57062: WORKER PROCESS EXITING 19285 1727203942.57133: no more pending results, returning what we have 19285 1727203942.57139: in VariableManager get_vars() 19285 1727203942.57173: Calling all_inventory to load vars for managed-node2 19285 1727203942.57179: Calling groups_inventory to load vars for managed-node2 19285 1727203942.57184: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.57199: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.57201: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.57204: Calling 
groups_plugins_play to load vars for managed-node2 19285 1727203942.58556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.59763: done with get_vars() 19285 1727203942.59787: variable 'ansible_search_path' from source: unknown 19285 1727203942.59799: we have included files to process 19285 1727203942.59799: generating all_blocks data 19285 1727203942.59800: done generating all_blocks data 19285 1727203942.59801: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203942.59802: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203942.59803: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19285 1727203942.59883: in VariableManager get_vars() 19285 1727203942.59898: done with get_vars() 19285 1727203942.59977: done processing included file 19285 1727203942.59979: iterating over new_blocks loaded from include file 19285 1727203942.59979: in VariableManager get_vars() 19285 1727203942.59987: done with get_vars() 19285 1727203942.59988: filtering new block on tags 19285 1727203942.59999: done filtering new block on tags 19285 1727203942.60001: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 19285 1727203942.60004: extending task lists for all hosts with included blocks 19285 1727203942.60022: done extending task lists 19285 1727203942.60023: done processing included files 19285 1727203942.60024: results queue empty 19285 1727203942.60024: checking for any_errors_fatal 19285 1727203942.60025: done checking for any_errors_fatal 19285 
1727203942.60025: checking for max_fail_percentage 19285 1727203942.60026: done checking for max_fail_percentage 19285 1727203942.60027: checking to see if all hosts have failed and the running result is not ok 19285 1727203942.60027: done checking to see if all hosts have failed 19285 1727203942.60027: getting the remaining hosts for this loop 19285 1727203942.60028: done getting the remaining hosts for this loop 19285 1727203942.60030: getting the next task for host managed-node2 19285 1727203942.60032: done getting next task for host managed-node2 19285 1727203942.60034: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19285 1727203942.60035: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203942.60037: getting variables 19285 1727203942.60038: in VariableManager get_vars() 19285 1727203942.60042: Calling all_inventory to load vars for managed-node2 19285 1727203942.60044: Calling groups_inventory to load vars for managed-node2 19285 1727203942.60045: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.60049: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.60050: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.60052: Calling groups_plugins_play to load vars for managed-node2 19285 1727203942.64178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.65296: done with get_vars() 19285 1727203942.65314: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:52:22 -0400 (0:00:00.094) 0:00:41.728 ***** 19285 1727203942.65364: entering _queue_task() for managed-node2/include_tasks 19285 1727203942.65696: worker is 1 (out of 1 available) 19285 1727203942.65709: exiting _queue_task() for managed-node2/include_tasks 19285 1727203942.65722: done queuing things up, now waiting for results queue to drain 19285 1727203942.65724: waiting for pending results... 
19285 1727203942.65986: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 19285 1727203942.66054: in run() - task 028d2410-947f-f31b-fb3f-0000000004d6 19285 1727203942.66069: variable 'ansible_search_path' from source: unknown 19285 1727203942.66072: variable 'ansible_search_path' from source: unknown 19285 1727203942.66103: calling self._execute() 19285 1727203942.66190: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203942.66195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203942.66203: variable 'omit' from source: magic vars 19285 1727203942.66631: variable 'ansible_distribution_major_version' from source: facts 19285 1727203942.66640: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203942.66645: _execute() done 19285 1727203942.66647: dumping result to json 19285 1727203942.66650: done dumping result, returning 19285 1727203942.66654: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-f31b-fb3f-0000000004d6] 19285 1727203942.66656: sending task result for task 028d2410-947f-f31b-fb3f-0000000004d6 19285 1727203942.66742: done sending task result for task 028d2410-947f-f31b-fb3f-0000000004d6 19285 1727203942.66745: WORKER PROCESS EXITING 19285 1727203942.66822: no more pending results, returning what we have 19285 1727203942.66827: in VariableManager get_vars() 19285 1727203942.66871: Calling all_inventory to load vars for managed-node2 19285 1727203942.66874: Calling groups_inventory to load vars for managed-node2 19285 1727203942.66880: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.66892: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.66895: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.66897: Calling groups_plugins_play to load vars for managed-node2 19285 
1727203942.68146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.69056: done with get_vars() 19285 1727203942.69078: variable 'ansible_search_path' from source: unknown 19285 1727203942.69079: variable 'ansible_search_path' from source: unknown 19285 1727203942.69087: variable 'task' from source: play vars 19285 1727203942.69197: variable 'task' from source: play vars 19285 1727203942.69237: we have included files to process 19285 1727203942.69238: generating all_blocks data 19285 1727203942.69240: done generating all_blocks data 19285 1727203942.69243: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203942.69244: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203942.69247: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19285 1727203942.69424: done processing included file 19285 1727203942.69427: iterating over new_blocks loaded from include file 19285 1727203942.69428: in VariableManager get_vars() 19285 1727203942.69447: done with get_vars() 19285 1727203942.69449: filtering new block on tags 19285 1727203942.69468: done filtering new block on tags 19285 1727203942.69471: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 19285 1727203942.69479: extending task lists for all hosts with included blocks 19285 1727203942.69581: done extending task lists 19285 1727203942.69582: done processing included files 19285 1727203942.69583: results queue empty 19285 1727203942.69583: checking for any_errors_fatal 19285 1727203942.69586: done checking 
for any_errors_fatal 19285 1727203942.69586: checking for max_fail_percentage 19285 1727203942.69587: done checking for max_fail_percentage 19285 1727203942.69588: checking to see if all hosts have failed and the running result is not ok 19285 1727203942.69588: done checking to see if all hosts have failed 19285 1727203942.69589: getting the remaining hosts for this loop 19285 1727203942.69590: done getting the remaining hosts for this loop 19285 1727203942.69591: getting the next task for host managed-node2 19285 1727203942.69595: done getting next task for host managed-node2 19285 1727203942.69597: ^ task is: TASK: Get stat for interface {{ interface }} 19285 1727203942.69600: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203942.69602: getting variables 19285 1727203942.69603: in VariableManager get_vars() 19285 1727203942.69612: Calling all_inventory to load vars for managed-node2 19285 1727203942.69615: Calling groups_inventory to load vars for managed-node2 19285 1727203942.69617: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203942.69622: Calling all_plugins_play to load vars for managed-node2 19285 1727203942.69624: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203942.69625: Calling groups_plugins_play to load vars for managed-node2 19285 1727203942.70527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203942.71600: done with get_vars() 19285 1727203942.71620: done getting variables 19285 1727203942.71747: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:52:22 -0400 (0:00:00.064) 0:00:41.792 ***** 19285 1727203942.71770: entering _queue_task() for managed-node2/stat 19285 1727203942.72043: worker is 1 (out of 1 available) 19285 1727203942.72057: exiting _queue_task() for managed-node2/stat 19285 1727203942.72073: done queuing things up, now waiting for results queue to drain 19285 1727203942.72074: waiting for pending results... 
19285 1727203942.72313: running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 19285 1727203942.72437: in run() - task 028d2410-947f-f31b-fb3f-0000000004e1 19285 1727203942.72449: variable 'ansible_search_path' from source: unknown 19285 1727203942.72453: variable 'ansible_search_path' from source: unknown 19285 1727203942.72484: calling self._execute() 19285 1727203942.72567: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203942.72571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203942.72578: variable 'omit' from source: magic vars 19285 1727203942.72887: variable 'ansible_distribution_major_version' from source: facts 19285 1727203942.72921: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203942.72925: variable 'omit' from source: magic vars 19285 1727203942.72970: variable 'omit' from source: magic vars 19285 1727203942.73049: variable 'interface' from source: set_fact 19285 1727203942.73065: variable 'omit' from source: magic vars 19285 1727203942.73100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203942.73128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203942.73145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203942.73159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203942.73170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203942.73196: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203942.73199: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203942.73203: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203942.73271: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203942.73279: Set connection var ansible_pipelining to False 19285 1727203942.73284: Set connection var ansible_timeout to 10 19285 1727203942.73287: Set connection var ansible_shell_type to sh 19285 1727203942.73293: Set connection var ansible_shell_executable to /bin/sh 19285 1727203942.73296: Set connection var ansible_connection to ssh 19285 1727203942.73310: variable 'ansible_shell_executable' from source: unknown 19285 1727203942.73316: variable 'ansible_connection' from source: unknown 19285 1727203942.73318: variable 'ansible_module_compression' from source: unknown 19285 1727203942.73321: variable 'ansible_shell_type' from source: unknown 19285 1727203942.73323: variable 'ansible_shell_executable' from source: unknown 19285 1727203942.73325: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203942.73328: variable 'ansible_pipelining' from source: unknown 19285 1727203942.73330: variable 'ansible_timeout' from source: unknown 19285 1727203942.73332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203942.73483: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19285 1727203942.73492: variable 'omit' from source: magic vars 19285 1727203942.73496: starting attempt loop 19285 1727203942.73499: running the handler 19285 1727203942.73516: _low_level_execute_command(): starting 19285 1727203942.73547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203942.74050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 
1727203942.74055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.74059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.74130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203942.74133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203942.74134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.74206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.75912: stdout chunk (state=3): >>>/root <<< 19285 1727203942.76020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203942.76064: stderr chunk (state=3): >>><<< 19285 1727203942.76068: stdout chunk (state=3): >>><<< 19285 1727203942.76130: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203942.76134: _low_level_execute_command(): starting 19285 1727203942.76137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241 `" && echo ansible-tmp-1727203942.7609644-22520-243012788660241="` echo /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241 `" ) && sleep 0' 19285 1727203942.76601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.76608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.76620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.76622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.76687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203942.76691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.76765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.78711: stdout chunk (state=3): >>>ansible-tmp-1727203942.7609644-22520-243012788660241=/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241 <<< 19285 1727203942.78818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203942.78846: stderr chunk (state=3): >>><<< 19285 1727203942.78849: stdout chunk (state=3): >>><<< 19285 1727203942.78895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203942.7609644-22520-243012788660241=/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203942.78928: variable 'ansible_module_compression' from source: unknown 19285 1727203942.78979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19285 1727203942.79013: variable 'ansible_facts' from source: unknown 19285 1727203942.79063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py 19285 1727203942.79171: Sending initial data 19285 1727203942.79174: Sent initial data (153 bytes) 19285 1727203942.79666: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.79670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.79685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 19285 1727203942.79699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.79751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203942.79758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203942.79761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.79828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.81393: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 19285 1727203942.81398: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203942.81462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203942.81535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpmi9zm6i4 /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py <<< 19285 1727203942.81538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py" <<< 19285 1727203942.81607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpmi9zm6i4" to remote "/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py" <<< 19285 1727203942.81610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py" <<< 19285 1727203942.82253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203942.82301: stderr chunk (state=3): >>><<< 19285 1727203942.82305: stdout chunk (state=3): >>><<< 19285 1727203942.82322: done transferring module to remote 19285 1727203942.82330: _low_level_execute_command(): starting 19285 1727203942.82335: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/ /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py && sleep 0' 19285 1727203942.82781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.82784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203942.82787: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.82789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203942.82797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203942.82799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.82844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203942.82852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.82923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203942.84663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203942.84689: stderr chunk (state=3): >>><<< 19285 1727203942.84693: stdout chunk (state=3): >>><<< 19285 1727203942.84707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203942.84711: _low_level_execute_command(): starting 19285 1727203942.84714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/AnsiballZ_stat.py && sleep 0' 19285 1727203942.85157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203942.85160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 19285 1727203942.85164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.85166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203942.85168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203942.85221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203942.85227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203942.85306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.00621: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19285 1727203943.01752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.02053: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. <<< 19285 1727203943.02057: stdout chunk (state=3): >>><<< 19285 1727203943.02060: stderr chunk (state=3): >>><<< 19285 1727203943.02062: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203943.02065: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203943.02067: _low_level_execute_command(): starting 19285 1727203943.02069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203942.7609644-22520-243012788660241/ > /dev/null 2>&1 && sleep 0' 19285 1727203943.02846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203943.02984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203943.03085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.03148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.05014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.05018: stdout chunk (state=3): >>><<< 19285 1727203943.05025: stderr chunk (state=3): >>><<< 19285 1727203943.05040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203943.05046: handler run complete 19285 1727203943.05065: attempt loop complete, returning result 19285 1727203943.05071: _execute() done 19285 1727203943.05074: dumping result to json 19285 1727203943.05077: done dumping result, returning 19285 1727203943.05086: done running TaskExecutor() for managed-node2/TASK: Get stat for interface LSR-TST-br31 [028d2410-947f-f31b-fb3f-0000000004e1] 19285 1727203943.05088: sending task result for task 028d2410-947f-f31b-fb3f-0000000004e1 19285 1727203943.05183: done sending task result for task 028d2410-947f-f31b-fb3f-0000000004e1 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 19285 1727203943.05234: no more pending results, returning what we have 19285 1727203943.05238: results queue empty 19285 1727203943.05239: checking for any_errors_fatal 19285 1727203943.05241: done checking for any_errors_fatal 19285 1727203943.05242: checking for max_fail_percentage 19285 1727203943.05243: done checking for max_fail_percentage 19285 1727203943.05244: checking to see if all hosts have failed and the running result is not ok 19285 1727203943.05245: done checking to see if all hosts have failed 19285 1727203943.05245: getting the remaining hosts for this loop 19285 1727203943.05247: done getting the remaining hosts for this loop 19285 1727203943.05250: getting the next task for host managed-node2 19285 1727203943.05262: done getting next task for host managed-node2 19285 1727203943.05264: ^ task is: TASK: Assert that the interface is 
absent - '{{ interface }}' 19285 1727203943.05267: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203943.05271: getting variables 19285 1727203943.05273: in VariableManager get_vars() 19285 1727203943.05305: Calling all_inventory to load vars for managed-node2 19285 1727203943.05308: Calling groups_inventory to load vars for managed-node2 19285 1727203943.05312: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.05322: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.05325: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.05327: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.05889: WORKER PROCESS EXITING 19285 1727203943.06604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.09965: done with get_vars() 19285 1727203943.10135: done getting variables 19285 1727203943.10200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19285 1727203943.10432: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:52:23 -0400 (0:00:00.388) 0:00:42.180 ***** 19285 1727203943.10581: entering _queue_task() for managed-node2/assert 19285 1727203943.11701: worker is 1 (out of 1 available) 19285 1727203943.11717: exiting _queue_task() for managed-node2/assert 19285 1727203943.11952: done queuing things up, now waiting for results queue to drain 19285 1727203943.11954: waiting for pending results... 19285 1727203943.12153: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 19285 1727203943.12412: in run() - task 028d2410-947f-f31b-fb3f-0000000004d7 19285 1727203943.12515: variable 'ansible_search_path' from source: unknown 19285 1727203943.12518: variable 'ansible_search_path' from source: unknown 19285 1727203943.12544: calling self._execute() 19285 1727203943.12751: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.12761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.12774: variable 'omit' from source: magic vars 19285 1727203943.13464: variable 'ansible_distribution_major_version' from source: facts 19285 1727203943.13705: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203943.13708: variable 'omit' from source: magic vars 19285 1727203943.13711: variable 'omit' from source: magic vars 19285 1727203943.13822: variable 'interface' from source: set_fact 19285 1727203943.13846: variable 'omit' from source: magic vars 19285 1727203943.13966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203943.14010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203943.14247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
19285 1727203943.14250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203943.14253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203943.14256: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203943.14464: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.14467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.14469: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203943.14584: Set connection var ansible_pipelining to False 19285 1727203943.14597: Set connection var ansible_timeout to 10 19285 1727203943.14605: Set connection var ansible_shell_type to sh 19285 1727203943.14617: Set connection var ansible_shell_executable to /bin/sh 19285 1727203943.14624: Set connection var ansible_connection to ssh 19285 1727203943.14651: variable 'ansible_shell_executable' from source: unknown 19285 1727203943.14659: variable 'ansible_connection' from source: unknown 19285 1727203943.14665: variable 'ansible_module_compression' from source: unknown 19285 1727203943.14671: variable 'ansible_shell_type' from source: unknown 19285 1727203943.14690: variable 'ansible_shell_executable' from source: unknown 19285 1727203943.14901: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.14904: variable 'ansible_pipelining' from source: unknown 19285 1727203943.14906: variable 'ansible_timeout' from source: unknown 19285 1727203943.14908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.15037: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203943.15132: variable 'omit' from source: magic vars 19285 1727203943.15141: starting attempt loop 19285 1727203943.15148: running the handler 19285 1727203943.15417: variable 'interface_stat' from source: set_fact 19285 1727203943.15456: Evaluated conditional (not interface_stat.stat.exists): True 19285 1727203943.15468: handler run complete 19285 1727203943.15660: attempt loop complete, returning result 19285 1727203943.15663: _execute() done 19285 1727203943.15665: dumping result to json 19285 1727203943.15668: done dumping result, returning 19285 1727203943.15670: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [028d2410-947f-f31b-fb3f-0000000004d7] 19285 1727203943.15672: sending task result for task 028d2410-947f-f31b-fb3f-0000000004d7 19285 1727203943.15744: done sending task result for task 028d2410-947f-f31b-fb3f-0000000004d7 19285 1727203943.15748: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 19285 1727203943.15807: no more pending results, returning what we have 19285 1727203943.15811: results queue empty 19285 1727203943.15812: checking for any_errors_fatal 19285 1727203943.15821: done checking for any_errors_fatal 19285 1727203943.15822: checking for max_fail_percentage 19285 1727203943.15823: done checking for max_fail_percentage 19285 1727203943.15824: checking to see if all hosts have failed and the running result is not ok 19285 1727203943.15825: done checking to see if all hosts have failed 19285 1727203943.15826: getting the remaining hosts for this loop 19285 1727203943.15828: done getting the remaining hosts for this loop 19285 1727203943.15831: getting the next task for host managed-node2 19285 1727203943.15841: done getting next task for host 
managed-node2 19285 1727203943.15843: ^ task is: TASK: meta (flush_handlers) 19285 1727203943.15845: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203943.15849: getting variables 19285 1727203943.15851: in VariableManager get_vars() 19285 1727203943.15888: Calling all_inventory to load vars for managed-node2 19285 1727203943.15891: Calling groups_inventory to load vars for managed-node2 19285 1727203943.15895: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.15905: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.15908: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.15910: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.18846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.22351: done with get_vars() 19285 1727203943.22596: done getting variables 19285 1727203943.23019: in VariableManager get_vars() 19285 1727203943.23031: Calling all_inventory to load vars for managed-node2 19285 1727203943.23034: Calling groups_inventory to load vars for managed-node2 19285 1727203943.23037: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.23042: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.23044: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.23047: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.26866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.28523: done with get_vars() 19285 1727203943.28554: done queuing things up, now waiting for 
results queue to drain 19285 1727203943.28556: results queue empty 19285 1727203943.28557: checking for any_errors_fatal 19285 1727203943.28560: done checking for any_errors_fatal 19285 1727203943.28561: checking for max_fail_percentage 19285 1727203943.28562: done checking for max_fail_percentage 19285 1727203943.28563: checking to see if all hosts have failed and the running result is not ok 19285 1727203943.28563: done checking to see if all hosts have failed 19285 1727203943.28574: getting the remaining hosts for this loop 19285 1727203943.28577: done getting the remaining hosts for this loop 19285 1727203943.28580: getting the next task for host managed-node2 19285 1727203943.28584: done getting next task for host managed-node2 19285 1727203943.28586: ^ task is: TASK: meta (flush_handlers) 19285 1727203943.28587: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203943.28590: getting variables 19285 1727203943.28591: in VariableManager get_vars() 19285 1727203943.28601: Calling all_inventory to load vars for managed-node2 19285 1727203943.28603: Calling groups_inventory to load vars for managed-node2 19285 1727203943.28605: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.28611: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.28613: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.28616: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.29834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.31547: done with get_vars() 19285 1727203943.31571: done getting variables 19285 1727203943.31633: in VariableManager get_vars() 19285 1727203943.31644: Calling all_inventory to load vars for managed-node2 19285 1727203943.31647: Calling groups_inventory to load vars for managed-node2 19285 1727203943.31649: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.31654: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.31657: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.31659: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.32861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.34910: done with get_vars() 19285 1727203943.34946: done queuing things up, now waiting for results queue to drain 19285 1727203943.34948: results queue empty 19285 1727203943.34949: checking for any_errors_fatal 19285 1727203943.34955: done checking for any_errors_fatal 19285 1727203943.34956: checking for max_fail_percentage 19285 1727203943.34957: done checking for max_fail_percentage 19285 1727203943.34958: checking to see if all hosts have failed and the running result is not 
ok 19285 1727203943.34959: done checking to see if all hosts have failed 19285 1727203943.34960: getting the remaining hosts for this loop 19285 1727203943.34961: done getting the remaining hosts for this loop 19285 1727203943.34964: getting the next task for host managed-node2 19285 1727203943.34967: done getting next task for host managed-node2 19285 1727203943.34968: ^ task is: None 19285 1727203943.34969: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203943.34971: done queuing things up, now waiting for results queue to drain 19285 1727203943.34972: results queue empty 19285 1727203943.34972: checking for any_errors_fatal 19285 1727203943.34973: done checking for any_errors_fatal 19285 1727203943.34974: checking for max_fail_percentage 19285 1727203943.34977: done checking for max_fail_percentage 19285 1727203943.34978: checking to see if all hosts have failed and the running result is not ok 19285 1727203943.34978: done checking to see if all hosts have failed 19285 1727203943.34979: getting the next task for host managed-node2 19285 1727203943.34981: done getting next task for host managed-node2 19285 1727203943.34982: ^ task is: None 19285 1727203943.34983: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203943.35027: in VariableManager get_vars() 19285 1727203943.35043: done with get_vars() 19285 1727203943.35049: in VariableManager get_vars() 19285 1727203943.35058: done with get_vars() 19285 1727203943.35068: variable 'omit' from source: magic vars 19285 1727203943.35103: in VariableManager get_vars() 19285 1727203943.35114: done with get_vars() 19285 1727203943.35139: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 19285 1727203943.35825: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19285 1727203943.35891: getting the remaining hosts for this loop 19285 1727203943.35893: done getting the remaining hosts for this loop 19285 1727203943.35895: getting the next task for host managed-node2 19285 1727203943.35898: done getting next task for host managed-node2 19285 1727203943.35900: ^ task is: TASK: Gathering Facts 19285 1727203943.35901: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203943.35903: getting variables 19285 1727203943.35904: in VariableManager get_vars() 19285 1727203943.35912: Calling all_inventory to load vars for managed-node2 19285 1727203943.36006: Calling groups_inventory to load vars for managed-node2 19285 1727203943.36010: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203943.36016: Calling all_plugins_play to load vars for managed-node2 19285 1727203943.36018: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203943.36020: Calling groups_plugins_play to load vars for managed-node2 19285 1727203943.39137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203943.42811: done with get_vars() 19285 1727203943.42844: done getting variables 19285 1727203943.42898: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Tuesday 24 September 2024 14:52:23 -0400 (0:00:00.323) 0:00:42.503 ***** 19285 1727203943.42927: entering _queue_task() for managed-node2/gather_facts 19285 1727203943.43908: worker is 1 (out of 1 available) 19285 1727203943.43922: exiting _queue_task() for managed-node2/gather_facts 19285 1727203943.43935: done queuing things up, now waiting for results queue to drain 19285 1727203943.43936: waiting for pending results... 
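The stat/assert pair traced earlier in this section (result `{"changed": false, "stat": {"exists": false}}` followed by "All assertions passed") corresponds to tasks roughly like the following. This is a hedged reconstruction from the `module_args` and the evaluated conditional `(not interface_stat.stat.exists)` visible in the trace; the real `assert_device_absent.yml` in the fedora.linux_system_roles collection may differ in detail:

```yaml
# Sketch reconstructed from the log's module_args and evaluated conditional.
# The variable names 'interface' and 'interface_stat' appear in the trace itself;
# everything else (task layout, msg text) is an assumption.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # log shows /sys/class/net/LSR-TST-br31
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists       # conditional evaluated True in the log
```

Checking `/sys/class/net/<name>` with `stat` is a cheap, module-only way to test for a kernel network device; the assert then turns the registered result into a pass/fail for the test run.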
19285 1727203943.44713: running TaskExecutor() for managed-node2/TASK: Gathering Facts 19285 1727203943.45110: in run() - task 028d2410-947f-f31b-fb3f-0000000004fa 19285 1727203943.45115: variable 'ansible_search_path' from source: unknown 19285 1727203943.45354: calling self._execute() 19285 1727203943.45573: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.45790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.45795: variable 'omit' from source: magic vars 19285 1727203943.46905: variable 'ansible_distribution_major_version' from source: facts 19285 1727203943.46924: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203943.46953: variable 'omit' from source: magic vars 19285 1727203943.47088: variable 'omit' from source: magic vars 19285 1727203943.47129: variable 'omit' from source: magic vars 19285 1727203943.47489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203943.47493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203943.47584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203943.47613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203943.47722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203943.47758: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203943.48142: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.48145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.48148: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 
1727203943.48150: Set connection var ansible_pipelining to False 19285 1727203943.48152: Set connection var ansible_timeout to 10 19285 1727203943.48154: Set connection var ansible_shell_type to sh 19285 1727203943.48156: Set connection var ansible_shell_executable to /bin/sh 19285 1727203943.48158: Set connection var ansible_connection to ssh 19285 1727203943.48286: variable 'ansible_shell_executable' from source: unknown 19285 1727203943.48295: variable 'ansible_connection' from source: unknown 19285 1727203943.48367: variable 'ansible_module_compression' from source: unknown 19285 1727203943.48379: variable 'ansible_shell_type' from source: unknown 19285 1727203943.48387: variable 'ansible_shell_executable' from source: unknown 19285 1727203943.48394: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203943.48402: variable 'ansible_pipelining' from source: unknown 19285 1727203943.48408: variable 'ansible_timeout' from source: unknown 19285 1727203943.48416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203943.48844: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203943.48865: variable 'omit' from source: magic vars 19285 1727203943.48878: starting attempt loop 19285 1727203943.48904: running the handler 19285 1727203943.48928: variable 'ansible_facts' from source: unknown 19285 1727203943.49081: _low_level_execute_command(): starting 19285 1727203943.49084: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203943.50801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 19285 1727203943.50896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203943.51502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.51602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.53301: stdout chunk (state=3): >>>/root <<< 19285 1727203943.53437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.53469: stdout chunk (state=3): >>><<< 19285 1727203943.53473: stderr chunk (state=3): >>><<< 19285 1727203943.53496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203943.53515: _low_level_execute_command(): starting 19285 1727203943.53525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615 `" && echo ansible-tmp-1727203943.5350327-22615-167321208948615="` echo /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615 `" ) && sleep 0' 19285 1727203943.55022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203943.55036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203943.55047: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203943.55222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203943.55293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203943.55494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.55583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.57529: stdout chunk (state=3): >>>ansible-tmp-1727203943.5350327-22615-167321208948615=/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615 <<< 19285 1727203943.57634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.57667: stderr chunk (state=3): >>><<< 19285 1727203943.57679: stdout chunk (state=3): >>><<< 19285 1727203943.57782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203943.5350327-22615-167321208948615=/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203943.57794: variable 'ansible_module_compression' from source: unknown 19285 1727203943.57899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19285 1727203943.58183: variable 'ansible_facts' from source: unknown 19285 1727203943.58634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py 19285 1727203943.58889: Sending initial data 19285 1727203943.58897: Sent initial data (154 bytes) 19285 1727203943.59582: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203943.59598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203943.59681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203943.59720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203943.59735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203943.59900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.59982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.61607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203943.61681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 19285 1727203943.61967: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmp0diwpo4y /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py <<< 19285 1727203943.61971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmp0diwpo4y" to remote "/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py" <<< 19285 1727203943.66265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.66269: stdout chunk (state=3): >>><<< 19285 1727203943.66271: stderr chunk (state=3): >>><<< 19285 1727203943.66272: done transferring module to remote 19285 1727203943.66274: _low_level_execute_command(): starting 19285 1727203943.66282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/ /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py && sleep 0' 19285 1727203943.67296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203943.67588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203943.67624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.67692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203943.69544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203943.69624: stderr chunk (state=3): >>><<< 19285 1727203943.69647: stdout chunk (state=3): >>><<< 19285 1727203943.69701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203943.69833: _low_level_execute_command(): starting 19285 1727203943.69837: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/AnsiballZ_setup.py && sleep 0' 19285 1727203943.70874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203943.70882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203943.70901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203943.70913: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203943.70922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203943.71005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203943.71018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203943.71107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 19285 1727203944.34067: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": 
"/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.65087890625, "5m": 0.4423828125, "15m": 0.22021484375}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 530, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787844608, "block_size": 4096, "block_total": 65519099, "block_available": 63913048, "block_used": 1606051, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 
43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "24", "epoch": "1727203944", "epoch_int": "1727203944", "date": "2024-09-24", "time": "14:52:24", "iso8601_micro": "2024-09-24T18:52:24.301187Z", "iso8601": "2024-09-24T18:52:24Z", "iso8601_basic": "20240924T145224301187", "iso8601_basic_short": "20240924T145224", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": 
["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19285 1727203944.36253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203944.36257: stdout chunk (state=3): >>><<< 19285 1727203944.36260: stderr chunk (state=3): >>><<< 19285 1727203944.36387: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", 
"ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.65087890625, "5m": 0.4423828125, "15m": 0.22021484375}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 530, "ansible_lvm": {"lvs": {}, "vgs": {}, 
"pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261787844608, "block_size": 4096, "block_total": 65519099, "block_available": 63913048, "block_used": 1606051, "inode_total": 131070960, "inode_available": 131027264, "inode_used": 43696, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "52", "second": "24", "epoch": "1727203944", "epoch_int": "1727203944", "date": "2024-09-24", "time": "14:52:24", "iso8601_micro": "2024-09-24T18:52:24.301187Z", "iso8601": "2024-09-24T18:52:24Z", "iso8601_basic": "20240924T145224301187", "iso8601_basic_short": "20240924T145224", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203944.37570: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203944.37782: _low_level_execute_command(): starting 19285 1727203944.37785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203943.5350327-22615-167321208948615/ > /dev/null 2>&1 && sleep 0' 19285 1727203944.38993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.39019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.39124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203944.39187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203944.39237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.39346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203944.41192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203944.41249: stderr chunk (state=3): >>><<< 19285 1727203944.41267: stdout chunk (state=3): >>><<< 19285 1727203944.41300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203944.41313: handler run complete 19285 
1727203944.41453: variable 'ansible_facts' from source: unknown 19285 1727203944.41574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.41918: variable 'ansible_facts' from source: unknown 19285 1727203944.42009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.42191: attempt loop complete, returning result 19285 1727203944.42194: _execute() done 19285 1727203944.42196: dumping result to json 19285 1727203944.42198: done dumping result, returning 19285 1727203944.42210: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-f31b-fb3f-0000000004fa] 19285 1727203944.42226: sending task result for task 028d2410-947f-f31b-fb3f-0000000004fa ok: [managed-node2] 19285 1727203944.43533: no more pending results, returning what we have 19285 1727203944.43537: results queue empty 19285 1727203944.43538: checking for any_errors_fatal 19285 1727203944.43539: done checking for any_errors_fatal 19285 1727203944.43540: checking for max_fail_percentage 19285 1727203944.43541: done checking for max_fail_percentage 19285 1727203944.43542: checking to see if all hosts have failed and the running result is not ok 19285 1727203944.43543: done checking to see if all hosts have failed 19285 1727203944.43544: getting the remaining hosts for this loop 19285 1727203944.43545: done getting the remaining hosts for this loop 19285 1727203944.43548: getting the next task for host managed-node2 19285 1727203944.43553: done getting next task for host managed-node2 19285 1727203944.43554: ^ task is: TASK: meta (flush_handlers) 19285 1727203944.43556: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19285 1727203944.43560: getting variables 19285 1727203944.43562: in VariableManager get_vars() 19285 1727203944.43585: Calling all_inventory to load vars for managed-node2 19285 1727203944.43588: Calling groups_inventory to load vars for managed-node2 19285 1727203944.43591: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203944.43601: Calling all_plugins_play to load vars for managed-node2 19285 1727203944.43604: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203944.43607: Calling groups_plugins_play to load vars for managed-node2 19285 1727203944.43686: done sending task result for task 028d2410-947f-f31b-fb3f-0000000004fa 19285 1727203944.43690: WORKER PROCESS EXITING 19285 1727203944.45982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.47847: done with get_vars() 19285 1727203944.47875: done getting variables 19285 1727203944.47948: in VariableManager get_vars() 19285 1727203944.47958: Calling all_inventory to load vars for managed-node2 19285 1727203944.47963: Calling groups_inventory to load vars for managed-node2 19285 1727203944.47966: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203944.47970: Calling all_plugins_play to load vars for managed-node2 19285 1727203944.47973: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203944.48032: Calling groups_plugins_play to load vars for managed-node2 19285 1727203944.49446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.51092: done with get_vars() 19285 1727203944.51122: done queuing things up, now waiting for results queue to drain 19285 1727203944.51124: results queue empty 19285 1727203944.51125: checking for any_errors_fatal 19285 1727203944.51129: done checking for any_errors_fatal 19285 1727203944.51130: checking for 
max_fail_percentage 19285 1727203944.51131: done checking for max_fail_percentage 19285 1727203944.51132: checking to see if all hosts have failed and the running result is not ok 19285 1727203944.51137: done checking to see if all hosts have failed 19285 1727203944.51138: getting the remaining hosts for this loop 19285 1727203944.51139: done getting the remaining hosts for this loop 19285 1727203944.51141: getting the next task for host managed-node2 19285 1727203944.51145: done getting next task for host managed-node2 19285 1727203944.51148: ^ task is: TASK: Verify network state restored to default 19285 1727203944.51149: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203944.51151: getting variables 19285 1727203944.51152: in VariableManager get_vars() 19285 1727203944.51163: Calling all_inventory to load vars for managed-node2 19285 1727203944.51165: Calling groups_inventory to load vars for managed-node2 19285 1727203944.51172: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203944.51179: Calling all_plugins_play to load vars for managed-node2 19285 1727203944.51182: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203944.51185: Calling groups_plugins_play to load vars for managed-node2 19285 1727203944.53516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.55103: done with get_vars() 19285 1727203944.55125: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Tuesday 24 September 2024 14:52:24 -0400 (0:00:01.123) 
0:00:43.626 ***** 19285 1727203944.55247: entering _queue_task() for managed-node2/include_tasks 19285 1727203944.55826: worker is 1 (out of 1 available) 19285 1727203944.55839: exiting _queue_task() for managed-node2/include_tasks 19285 1727203944.55852: done queuing things up, now waiting for results queue to drain 19285 1727203944.55854: waiting for pending results... 19285 1727203944.56129: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 19285 1727203944.56235: in run() - task 028d2410-947f-f31b-fb3f-00000000007a 19285 1727203944.56255: variable 'ansible_search_path' from source: unknown 19285 1727203944.56303: calling self._execute() 19285 1727203944.56400: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203944.56416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203944.56432: variable 'omit' from source: magic vars 19285 1727203944.56813: variable 'ansible_distribution_major_version' from source: facts 19285 1727203944.56829: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203944.56878: _execute() done 19285 1727203944.56883: dumping result to json 19285 1727203944.56885: done dumping result, returning 19285 1727203944.56887: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [028d2410-947f-f31b-fb3f-00000000007a] 19285 1727203944.56889: sending task result for task 028d2410-947f-f31b-fb3f-00000000007a 19285 1727203944.57201: no more pending results, returning what we have 19285 1727203944.57205: in VariableManager get_vars() 19285 1727203944.57233: Calling all_inventory to load vars for managed-node2 19285 1727203944.57236: Calling groups_inventory to load vars for managed-node2 19285 1727203944.57239: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203944.57249: Calling all_plugins_play to load vars for managed-node2 19285 1727203944.57251: Calling 
groups_plugins_inventory to load vars for managed-node2 19285 1727203944.57254: Calling groups_plugins_play to load vars for managed-node2 19285 1727203944.57892: done sending task result for task 028d2410-947f-f31b-fb3f-00000000007a 19285 1727203944.57896: WORKER PROCESS EXITING 19285 1727203944.59558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.61109: done with get_vars() 19285 1727203944.61131: variable 'ansible_search_path' from source: unknown 19285 1727203944.61147: we have included files to process 19285 1727203944.61148: generating all_blocks data 19285 1727203944.61149: done generating all_blocks data 19285 1727203944.61150: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19285 1727203944.61151: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19285 1727203944.61154: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19285 1727203944.61621: done processing included file 19285 1727203944.61623: iterating over new_blocks loaded from include file 19285 1727203944.61624: in VariableManager get_vars() 19285 1727203944.61637: done with get_vars() 19285 1727203944.61639: filtering new block on tags 19285 1727203944.61655: done filtering new block on tags 19285 1727203944.61657: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 19285 1727203944.61662: extending task lists for all hosts with included blocks 19285 1727203944.61696: done extending task lists 19285 1727203944.61697: done processing included files 19285 1727203944.61698: results queue empty 19285 
1727203944.61698: checking for any_errors_fatal 19285 1727203944.61700: done checking for any_errors_fatal 19285 1727203944.61700: checking for max_fail_percentage 19285 1727203944.61701: done checking for max_fail_percentage 19285 1727203944.61702: checking to see if all hosts have failed and the running result is not ok 19285 1727203944.61703: done checking to see if all hosts have failed 19285 1727203944.61704: getting the remaining hosts for this loop 19285 1727203944.61705: done getting the remaining hosts for this loop 19285 1727203944.61707: getting the next task for host managed-node2 19285 1727203944.61711: done getting next task for host managed-node2 19285 1727203944.61713: ^ task is: TASK: Check routes and DNS 19285 1727203944.61715: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203944.61717: getting variables 19285 1727203944.61718: in VariableManager get_vars() 19285 1727203944.61726: Calling all_inventory to load vars for managed-node2 19285 1727203944.61728: Calling groups_inventory to load vars for managed-node2 19285 1727203944.61730: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203944.61735: Calling all_plugins_play to load vars for managed-node2 19285 1727203944.61738: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203944.61741: Calling groups_plugins_play to load vars for managed-node2 19285 1727203944.62891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203944.64525: done with get_vars() 19285 1727203944.64543: done getting variables 19285 1727203944.64581: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:52:24 -0400 (0:00:00.093) 0:00:43.720 ***** 19285 1727203944.64607: entering _queue_task() for managed-node2/shell 19285 1727203944.64962: worker is 1 (out of 1 available) 19285 1727203944.65079: exiting _queue_task() for managed-node2/shell 19285 1727203944.65092: done queuing things up, now waiting for results queue to drain 19285 1727203944.65094: waiting for pending results... 
19285 1727203944.65286: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 19285 1727203944.65406: in run() - task 028d2410-947f-f31b-fb3f-00000000050b 19285 1727203944.65427: variable 'ansible_search_path' from source: unknown 19285 1727203944.65439: variable 'ansible_search_path' from source: unknown 19285 1727203944.65484: calling self._execute() 19285 1727203944.65672: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203944.65713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203944.65759: variable 'omit' from source: magic vars 19285 1727203944.66539: variable 'ansible_distribution_major_version' from source: facts 19285 1727203944.66558: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203944.66573: variable 'omit' from source: magic vars 19285 1727203944.66612: variable 'omit' from source: magic vars 19285 1727203944.66646: variable 'omit' from source: magic vars 19285 1727203944.66698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203944.66735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203944.66766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203944.66847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203944.66852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203944.66857: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203944.66859: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203944.66866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203944.67153: 
Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203944.67172: Set connection var ansible_pipelining to False 19285 1727203944.67297: Set connection var ansible_timeout to 10 19285 1727203944.67309: Set connection var ansible_shell_type to sh 19285 1727203944.67349: Set connection var ansible_shell_executable to /bin/sh 19285 1727203944.67372: Set connection var ansible_connection to ssh 19285 1727203944.67414: variable 'ansible_shell_executable' from source: unknown 19285 1727203944.67417: variable 'ansible_connection' from source: unknown 19285 1727203944.67420: variable 'ansible_module_compression' from source: unknown 19285 1727203944.67459: variable 'ansible_shell_type' from source: unknown 19285 1727203944.67464: variable 'ansible_shell_executable' from source: unknown 19285 1727203944.67466: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203944.67469: variable 'ansible_pipelining' from source: unknown 19285 1727203944.67472: variable 'ansible_timeout' from source: unknown 19285 1727203944.67587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203944.67994: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203944.67998: variable 'omit' from source: magic vars 19285 1727203944.68000: starting attempt loop 19285 1727203944.68012: running the handler 19285 1727203944.68077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203944.68268: 
_low_level_execute_command(): starting 19285 1727203944.68278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203944.69755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203944.69783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203944.69815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203944.69833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203944.69850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203944.69907: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.70008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.70058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203944.70096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203944.70152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.70256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203944.71948: stdout chunk (state=3): >>>/root <<< 19285 1727203944.72103: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 19285 1727203944.72106: stdout chunk (state=3): >>><<< 19285 1727203944.72108: stderr chunk (state=3): >>><<< 19285 1727203944.72187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203944.72190: _low_level_execute_command(): starting 19285 1727203944.72193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564 `" && echo ansible-tmp-1727203944.7213237-22771-50433019597564="` echo /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564 `" ) && sleep 0' 19285 1727203944.72825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203944.72847: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 19285 1727203944.72859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203944.72884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203944.72915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203944.72919: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203944.72982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.73002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.73045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203944.73071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.73174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203944.75140: stdout chunk (state=3): >>>ansible-tmp-1727203944.7213237-22771-50433019597564=/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564 <<< 19285 1727203944.75366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203944.75370: stdout chunk (state=3): >>><<< 19285 1727203944.75373: stderr chunk (state=3): >>><<< 19285 1727203944.75584: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203944.7213237-22771-50433019597564=/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203944.75596: variable 'ansible_module_compression' from source: unknown 19285 1727203944.75599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19285 1727203944.75602: variable 'ansible_facts' from source: unknown 19285 1727203944.75823: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py 19285 1727203944.76244: Sending initial data 19285 1727203944.76286: Sent initial data (155 bytes) 19285 1727203944.77046: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
19285 1727203944.77089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19285 1727203944.77100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203944.77153: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.77203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203944.77217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203944.77235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.77334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203944.78998: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203944.79092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 19285 1727203944.79538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpoz3awx_6 /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py <<< 19285 1727203944.79546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py" <<< 19285 1727203944.79572: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpoz3awx_6" to remote "/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py" <<< 19285 1727203944.80666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203944.80700: stderr chunk (state=3): >>><<< 19285 1727203944.80834: stdout chunk (state=3): >>><<< 19285 1727203944.80837: done transferring module to remote 19285 1727203944.80844: _low_level_execute_command(): starting 19285 1727203944.80847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/ /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py && sleep 0' 19285 1727203944.81709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203944.81723: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19285 1727203944.81737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203944.81758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203944.81772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203944.81780: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203944.81890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203944.81905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.82023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203944.83891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203944.83911: stdout chunk (state=3): >>><<< 19285 1727203944.83941: stderr chunk (state=3): >>><<< 19285 1727203944.83975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203944.84055: _low_level_execute_command(): starting 19285 1727203944.84070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/AnsiballZ_command.py && sleep 0' 19285 1727203944.84926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203944.84964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203944.84984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203944.85005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203944.85147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.01512: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3364sec preferred_lft 3364sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:52:25.003819", "end": "2024-09-24 
14:52:25.012581", "delta": "0:00:00.008762", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203945.02950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203945.02982: stderr chunk (state=3): >>><<< 19285 1727203945.02991: stdout chunk (state=3): >>><<< 19285 1727203945.03015: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3364sec preferred_lft 3364sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip 
route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:52:25.003819", "end": "2024-09-24 14:52:25.012581", "delta": "0:00:00.008762", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
19285 1727203945.03073: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203945.03089: _low_level_execute_command(): starting 19285 1727203945.03161: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203944.7213237-22771-50433019597564/ > /dev/null 2>&1 && sleep 0' 19285 1727203945.03663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203945.03684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203945.03699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203945.03716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203945.03731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203945.03740: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203945.03751: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203945.03767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19285 1727203945.03781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 19285 1727203945.03791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19285 1727203945.03801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19285 1727203945.03812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203945.03824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203945.03835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203945.03892: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203945.03978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203945.03991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.04081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.06062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203945.06088: stdout chunk (state=3): >>><<< 19285 1727203945.06485: stderr chunk (state=3): >>><<< 19285 1727203945.06489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203945.06492: handler run complete 19285 1727203945.06494: Evaluated conditional (False): False 19285 1727203945.06496: attempt loop complete, returning result 19285 1727203945.06498: _execute() done 19285 1727203945.06500: dumping result to json 19285 1727203945.06502: done dumping result, returning 19285 1727203945.06504: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [028d2410-947f-f31b-fb3f-00000000050b] 19285 1727203945.06506: sending task result for task 028d2410-947f-f31b-fb3f-00000000050b 19285 1727203945.06582: done sending task result for task 028d2410-947f-f31b-fb3f-00000000050b 19285 1727203945.06590: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008762", "end": "2024-09-24 14:52:25.012581", "rc": 0, "start": "2024-09-24 14:52:25.003819" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state 
UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3364sec preferred_lft 3364sec inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 19285 1727203945.06664: no more pending results, returning what we have 19285 1727203945.06669: results queue empty 19285 1727203945.06670: checking for any_errors_fatal 19285 1727203945.06671: done checking for any_errors_fatal 19285 1727203945.06672: checking for max_fail_percentage 19285 1727203945.06674: done checking for max_fail_percentage 19285 1727203945.06677: checking to see if all hosts have failed and the running result is not ok 19285 1727203945.06678: done checking to see if all hosts have failed 19285 1727203945.06679: getting the remaining hosts for this loop 19285 1727203945.06681: done getting the remaining hosts for this loop 19285 1727203945.06685: getting the next task for host managed-node2 19285 1727203945.06696: done getting next task for host managed-node2 19285 1727203945.06700: ^ task is: TASK: Verify DNS and network connectivity 19285 1727203945.06703: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19285 1727203945.06711: getting variables 19285 1727203945.06713: in VariableManager get_vars() 19285 1727203945.06744: Calling all_inventory to load vars for managed-node2 19285 1727203945.06747: Calling groups_inventory to load vars for managed-node2 19285 1727203945.06751: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203945.06762: Calling all_plugins_play to load vars for managed-node2 19285 1727203945.06766: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203945.06769: Calling groups_plugins_play to load vars for managed-node2 19285 1727203945.09890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203945.12686: done with get_vars() 19285 1727203945.12712: done getting variables 19285 1727203945.12784: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:52:25 -0400 (0:00:00.482) 0:00:44.202 ***** 19285 1727203945.12813: entering _queue_task() for managed-node2/shell 19285 1727203945.13310: worker is 1 (out of 1 available) 19285 1727203945.13323: exiting 
_queue_task() for managed-node2/shell 19285 1727203945.13336: done queuing things up, now waiting for results queue to drain 19285 1727203945.13337: waiting for pending results... 19285 1727203945.13514: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 19285 1727203945.13636: in run() - task 028d2410-947f-f31b-fb3f-00000000050c 19285 1727203945.13676: variable 'ansible_search_path' from source: unknown 19285 1727203945.13680: variable 'ansible_search_path' from source: unknown 19285 1727203945.13709: calling self._execute() 19285 1727203945.13854: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203945.13858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203945.13861: variable 'omit' from source: magic vars 19285 1727203945.14261: variable 'ansible_distribution_major_version' from source: facts 19285 1727203945.14280: Evaluated conditional (ansible_distribution_major_version != '6'): True 19285 1727203945.14443: variable 'ansible_facts' from source: unknown 19285 1727203945.15461: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 19285 1727203945.15477: variable 'omit' from source: magic vars 19285 1727203945.15680: variable 'omit' from source: magic vars 19285 1727203945.15683: variable 'omit' from source: magic vars 19285 1727203945.15815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19285 1727203945.15832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19285 1727203945.15858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19285 1727203945.15910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203945.15935: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19285 1727203945.16033: variable 'inventory_hostname' from source: host vars for 'managed-node2' 19285 1727203945.16043: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203945.16051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203945.16357: Set connection var ansible_module_compression to ZIP_DEFLATED 19285 1727203945.16360: Set connection var ansible_pipelining to False 19285 1727203945.16362: Set connection var ansible_timeout to 10 19285 1727203945.16365: Set connection var ansible_shell_type to sh 19285 1727203945.16366: Set connection var ansible_shell_executable to /bin/sh 19285 1727203945.16466: Set connection var ansible_connection to ssh 19285 1727203945.16470: variable 'ansible_shell_executable' from source: unknown 19285 1727203945.16472: variable 'ansible_connection' from source: unknown 19285 1727203945.16478: variable 'ansible_module_compression' from source: unknown 19285 1727203945.16487: variable 'ansible_shell_type' from source: unknown 19285 1727203945.16494: variable 'ansible_shell_executable' from source: unknown 19285 1727203945.16500: variable 'ansible_host' from source: host vars for 'managed-node2' 19285 1727203945.16508: variable 'ansible_pipelining' from source: unknown 19285 1727203945.16514: variable 'ansible_timeout' from source: unknown 19285 1727203945.16522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 19285 1727203945.16825: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203945.16883: variable 'omit' from source: magic vars 19285 1727203945.16896: starting attempt 
loop 19285 1727203945.17083: running the handler 19285 1727203945.17087: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19285 1727203945.17090: _low_level_execute_command(): starting 19285 1727203945.17093: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19285 1727203945.18503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203945.18839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.18867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.20583: stdout chunk (state=3): >>>/root <<< 19285 1727203945.20725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 
1727203945.20731: stdout chunk (state=3): >>><<< 19285 1727203945.20747: stderr chunk (state=3): >>><<< 19285 1727203945.20765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203945.20779: _low_level_execute_command(): starting 19285 1727203945.20787: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355 `" && echo ansible-tmp-1727203945.2076395-22787-253818179087355="` echo /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355 `" ) && sleep 0' 19285 1727203945.22289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203945.22293: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203945.22304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.22486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.24486: stdout chunk (state=3): >>>ansible-tmp-1727203945.2076395-22787-253818179087355=/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355 <<< 19285 1727203945.24584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203945.24628: stderr chunk (state=3): >>><<< 19285 1727203945.24791: stdout chunk (state=3): >>><<< 19285 1727203945.24813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203945.2076395-22787-253818179087355=/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203945.24845: variable 'ansible_module_compression' from source: unknown 19285 1727203945.24899: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-192853s23vno8/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19285 1727203945.24936: variable 'ansible_facts' from source: unknown 19285 1727203945.25120: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py 19285 1727203945.25372: Sending initial data 19285 1727203945.25377: Sent initial data (156 bytes) 19285 1727203945.27073: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203945.27079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203945.27082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19285 1727203945.27085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19285 1727203945.27107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 19285 1727203945.27113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203945.27126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.27324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.29107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 19285 1727203945.29183: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 19285 1727203945.29248: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-192853s23vno8/tmpww016a_t /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py <<< 19285 1727203945.29252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py" <<< 19285 1727203945.29470: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-192853s23vno8/tmpww016a_t" to remote "/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py" <<< 19285 1727203945.30701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203945.30724: stderr chunk (state=3): >>><<< 19285 1727203945.30727: stdout chunk (state=3): >>><<< 19285 1727203945.30786: done transferring module to remote 19285 1727203945.30797: _low_level_execute_command(): starting 19285 1727203945.30802: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/ /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py && sleep 0' 19285 1727203945.31893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19285 1727203945.31897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 19285 1727203945.31908: stderr chunk (state=3): >>>debug2: match not found <<< 19285 1727203945.32009: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203945.32034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.32136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.34032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203945.34036: stdout chunk (state=3): >>><<< 19285 1727203945.34038: stderr chunk (state=3): >>><<< 19285 1727203945.34063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203945.34080: _low_level_execute_command(): starting 19285 1727203945.34093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/AnsiballZ_command.py && sleep 0' 19285 1727203945.35495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203945.35672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203945.35920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.36110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 
1727203945.71082: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4451 0 --:--:-- --:--:-- --:--:-- 4485\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2744 0 --:--:-- --:--:-- --:--:-- 2771", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:52:25.510505", "end": "2024-09-24 14:52:25.709698", "delta": "0:00:00.199193", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19285 1727203945.72716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 19285 1727203945.72727: stdout chunk (state=3): >>><<< 19285 1727203945.72749: stderr chunk (state=3): >>><<< 19285 1727203945.72777: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4451 0 --:--:-- --:--:-- --:--:-- 4485\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2744 0 --:--:-- --:--:-- --:--:-- 2771", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:52:25.510505", "end": "2024-09-24 14:52:25.709698", "delta": "0:00:00.199193", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 19285 1727203945.72835: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19285 1727203945.72849: _low_level_execute_command(): starting 19285 1727203945.72859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203945.2076395-22787-253818179087355/ > /dev/null 2>&1 && sleep 0' 19285 1727203945.73580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19285 1727203945.73690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 19285 1727203945.73722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19285 1727203945.73832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19285 1727203945.75985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19285 1727203945.75989: stdout chunk (state=3): >>><<< 19285 1727203945.75991: stderr chunk (state=3): >>><<< 19285 1727203945.75993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19285 1727203945.75994: handler run complete 19285 1727203945.75996: Evaluated conditional (False): False 19285 
1727203945.75997: attempt loop complete, returning result 19285 1727203945.75999: _execute() done 19285 1727203945.76000: dumping result to json 19285 1727203945.76002: done dumping result, returning 19285 1727203945.76008: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [028d2410-947f-f31b-fb3f-00000000050c] 19285 1727203945.76010: sending task result for task 028d2410-947f-f31b-fb3f-00000000050c 19285 1727203945.76085: done sending task result for task 028d2410-947f-f31b-fb3f-00000000050c 19285 1727203945.76089: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.199193", "end": "2024-09-24 14:52:25.709698", "rc": 0, "start": "2024-09-24 14:52:25.510505" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 4451 0 --:--:-- --:--:-- --:--:-- 4485 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2744 0 --:--:-- --:--:-- --:--:-- 2771 19285 1727203945.76211: no more pending results, returning what we have 19285 1727203945.76214: results queue empty 19285 1727203945.76215: checking for any_errors_fatal 19285 1727203945.76222: done checking for any_errors_fatal 19285 1727203945.76223: checking for max_fail_percentage 19285 1727203945.76224: done checking for max_fail_percentage 19285 1727203945.76229: checking to see if all hosts have failed and the running result is not ok 19285 1727203945.76230: done checking to see if all hosts have failed 19285 1727203945.76231: getting the remaining hosts for this loop 19285 1727203945.76232: done getting the remaining hosts for this loop 19285 1727203945.76235: getting the next task for host managed-node2 19285 1727203945.76243: done getting next task for host managed-node2 19285 1727203945.76245: ^ task is: TASK: meta (flush_handlers) 19285 1727203945.76250: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19285 1727203945.76255: getting variables 19285 1727203945.76256: in VariableManager get_vars() 19285 1727203945.76352: Calling all_inventory to load vars for managed-node2 19285 1727203945.76355: Calling groups_inventory to load vars for managed-node2 19285 1727203945.76358: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203945.76371: Calling all_plugins_play to load vars for managed-node2 19285 1727203945.76374: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203945.76379: Calling groups_plugins_play to load vars for managed-node2 19285 1727203945.78440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203945.81215: done with get_vars() 19285 1727203945.81248: done getting variables 19285 1727203945.81491: in VariableManager get_vars() 19285 1727203945.81502: Calling all_inventory to load vars for managed-node2 19285 1727203945.81505: Calling groups_inventory to load vars for managed-node2 19285 1727203945.81507: Calling all_plugins_inventory to load vars for managed-node2 19285 1727203945.81513: Calling all_plugins_play to load vars for managed-node2 19285 1727203945.81515: Calling groups_plugins_inventory to load vars for managed-node2 19285 1727203945.81518: Calling groups_plugins_play to load vars for managed-node2 19285 1727203945.83312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19285 1727203945.85586: done with get_vars() 19285 1727203945.85623: done queuing things up, now waiting for results queue to drain 19285 1727203945.85626: results queue empty 19285 1727203945.85629: checking for any_errors_fatal 19285 1727203945.85636: done checking for any_errors_fatal 19285 1727203945.85637: checking for max_fail_percentage 19285 1727203945.85638: done checking for max_fail_percentage 19285 1727203945.85639: checking to see if all hosts have failed and the running result is not 
ok
19285 1727203945.85644: done checking to see if all hosts have failed
19285 1727203945.85645: getting the remaining hosts for this loop
19285 1727203945.85646: done getting the remaining hosts for this loop
19285 1727203945.85649: getting the next task for host managed-node2
19285 1727203945.85653: done getting next task for host managed-node2
19285 1727203945.85655: ^ task is: TASK: meta (flush_handlers)
19285 1727203945.85657: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203945.85662: getting variables
19285 1727203945.85663: in VariableManager get_vars()
19285 1727203945.85673: Calling all_inventory to load vars for managed-node2
19285 1727203945.85707: Calling groups_inventory to load vars for managed-node2
19285 1727203945.85711: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203945.85717: Calling all_plugins_play to load vars for managed-node2
19285 1727203945.85719: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203945.85722: Calling groups_plugins_play to load vars for managed-node2
19285 1727203945.88143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203945.89845: done with get_vars()
19285 1727203945.89872: done getting variables
19285 1727203945.89924: in VariableManager get_vars()
19285 1727203945.89933: Calling all_inventory to load vars for managed-node2
19285 1727203945.89940: Calling groups_inventory to load vars for managed-node2
19285 1727203945.89943: Calling all_plugins_inventory to load vars for managed-node2
19285 1727203945.89952: Calling all_plugins_play to load vars for managed-node2
19285 1727203945.89955: Calling groups_plugins_inventory to load vars for managed-node2
19285 1727203945.89958: Calling groups_plugins_play to load vars for managed-node2
19285 1727203945.91146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19285 1727203945.92414: done with get_vars()
19285 1727203945.92435: done queuing things up, now waiting for results queue to drain
19285 1727203945.92437: results queue empty
19285 1727203945.92437: checking for any_errors_fatal
19285 1727203945.92438: done checking for any_errors_fatal
19285 1727203945.92439: checking for max_fail_percentage
19285 1727203945.92439: done checking for max_fail_percentage
19285 1727203945.92440: checking to see if all hosts have failed and the running result is not ok
19285 1727203945.92440: done checking to see if all hosts have failed
19285 1727203945.92441: getting the remaining hosts for this loop
19285 1727203945.92441: done getting the remaining hosts for this loop
19285 1727203945.92444: getting the next task for host managed-node2
19285 1727203945.92446: done getting next task for host managed-node2
19285 1727203945.92446: ^ task is: None
19285 1727203945.92447: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19285 1727203945.92448: done queuing things up, now waiting for results queue to drain
19285 1727203945.92449: results queue empty
19285 1727203945.92449: checking for any_errors_fatal
19285 1727203945.92449: done checking for any_errors_fatal
19285 1727203945.92450: checking for max_fail_percentage
19285 1727203945.92450: done checking for max_fail_percentage
19285 1727203945.92451: checking to see if all hosts have failed and the running result is not ok
19285 1727203945.92451: done checking to see if all hosts have failed
19285 1727203945.92452: getting the next task for host managed-node2
19285 1727203945.92453: done getting next task for host managed-node2
19285 1727203945.92454: ^ task is: None
19285 1727203945.92455: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=81   changed=3    unreachable=0    failed=0    skipped=72   rescued=0    ignored=2

Tuesday 24 September 2024  14:52:25 -0400 (0:00:00.796)       0:00:44.999 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.22s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.85s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.27s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.20s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 1.11s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.03s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.01s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.97s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.90s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Check if system is ostree ----------------------------------------------- 0.86s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
19285 1727203945.92544: RUNNING CLEANUP
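The PLAY RECAP line above packs the per-host result counters into `key=value` pairs on a single line. As a minimal sketch (the function name `parse_recap_line` is hypothetical, not part of Ansible), such a line can be turned into a dictionary for post-processing a saved log:

```python
import re

def parse_recap_line(line: str) -> dict:
    """Parse one PLAY RECAP host line (hypothetical helper, not an Ansible API).

    Example input:
    'managed-node2 : ok=81 changed=3 unreachable=0 failed=0 skipped=72 rescued=0 ignored=2'
    """
    host, _, counters = line.partition(":")
    # Collect every key=value counter pair as an integer.
    stats = {key: int(value) for key, value in re.findall(r"(\w+)=(\d+)", counters)}
    stats["host"] = host.strip()
    return stats

line = "managed-node2 : ok=81 changed=3 unreachable=0 failed=0 skipped=72 rescued=0 ignored=2"
stats = parse_recap_line(line)
# stats["ok"] == 81 and stats["failed"] == 0, matching the recap above
```

A parser like this is one way to flag failed or unreachable hosts when scanning many archived CI logs.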